
CPUs Do Affect Gaming Performance, After All

crookedvulture writes "For years, PC hardware sites have maintained that CPUs have little impact on gaming performance; all you need is a decent graphics card. That position is largely supported by FPS averages, but the FPS metric doesn't tell the whole story. Examining individual frame latencies better exposes the brief moments of stuttering that can disrupt otherwise smooth gameplay. Those methods have now been used to quantify the gaming performance of 18 CPUs spanning three generations. The results illustrate a clear advantage for Intel, whose CPUs enjoy lower frame latencies than comparable offerings from AMD. While the newer Intel processors perform better than their predecessors, the opposite tends to be true for the latest AMD chips. Turns out AMD's Phenom II X4 980, which is over a year old, offers lower frame latencies than the most recent FX processors."
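
A quick illustration of why the frame-latency methodology matters: two traces can report identical average FPS while one of them stutters badly. A toy sketch with made-up frame times (not data from the article):

```python
# Toy illustration (made-up numbers): average FPS hides stutter that
# high-percentile frame times expose.

smooth = [16.7] * 100                # ~60 FPS, every frame on time
spiky = [12.0] * 95 + [106.0] * 5    # same total time, but 5 long stalls

def avg_fps(frame_times_ms):
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def percentile(frame_times_ms, pct):
    ordered = sorted(frame_times_ms)
    return ordered[int(len(ordered) * pct / 100.0) - 1]

for name, trace in (("smooth", smooth), ("spiky", spiky)):
    print(f"{name}: {avg_fps(trace):.1f} avg FPS, "
          f"99th-percentile frame time {percentile(trace, 99):.1f} ms")
```

Both traces average roughly 60 FPS, but the spiky one spends five frames over 100 ms, which is exactly the kind of hitch the averages bury.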
  • by locopuyo ( 1433631 ) on Thursday August 23, 2012 @06:01PM (#41102273) Homepage
    In StarCraft 2 my CPU is the bottleneck.
  • Err (Score:5, Interesting)

    by bhcompy ( 1877290 ) on Thursday August 23, 2012 @06:03PM (#41102307)
    Which idiot made that claim? Pretty much every hardware review site includes both CPU-dependent and GPU-dependent games when it reviews GPUs, CPUs, and OTB rigs.
  • What. What?! (Score:5, Interesting)

    by RyanFenton ( 230700 ) on Thursday August 23, 2012 @06:10PM (#41102403)

    Who thought that CPUs didn't bottleneck gaming performance? Who ever thought that? Only the smallest of tech demos use only GPU resources - every modern computer/console game I'm aware of uses, well, some regular programming language that needs a CPU to execute its instructions and is inherently limited by the clock cycles and interrupts of those CPUs.

    GPUs only tend to let you offload the straight-shot parallelized stuff - graphics blits, audio, textures & lighting - but the core of the game logic is still tied to the CPU. Even if you aren't straining the limits of the CPU in the final implementation, programmers are still limited by its capacity.

    Otherwise, all our games would just be done with simple ray-traced logic, using pure geometry and physics; there would be no limits on the number or kind of interactions allowed in a game world; game logic would be built on unlimited tables of generated content; and we'd quickly build games of infinite recursion, simulating all known aspects of the universe far beyond the shallow cut-out worlds we develop today.

    But we can't properly design for that - we design for the CPUs we work with, and the other helper processors have never changed that.

    Ryan Fenton
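
The shape of a standard game loop makes this point concrete: everything in the update step runs on the CPU, and rendering is the only stage a GPU can take over. A minimal sketch, not from any particular engine; update() and draw() are hypothetical stand-ins for real engine work:

```python
# Minimal fixed-timestep game loop sketch (illustrative only).
import time

TICK = 1.0 / 60.0  # fixed simulation timestep, ~60 updates per second

def update(world, dt):
    # AI, physics, input handling, scripting: serial CPU-side work
    world["t"] = world.get("t", 0.0) + dt

def draw(world):
    # rendering is the only stage a GPU can take off the CPU's hands
    pass

def run(world, seconds=1.0):
    accumulator = 0.0
    last = time.perf_counter()
    deadline = last + seconds
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        accumulator += now - last
        last = now
        while accumulator >= TICK:  # a slow update() stalls frames right here
            update(world, TICK)
            accumulator -= TICK
        draw(world)

run({})
```

If update() overruns its budget, no amount of GPU horsepower gets the frame out on time.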

  • For years? (Score:5, Interesting)

    by _Shorty-dammit ( 555739 ) on Thursday August 23, 2012 @06:11PM (#41102413)

    I don't recall ever reading on any PC hardware site anyone claiming that the CPU doesn't matter and all you need is a good graphics card. How on earth did anyone ever successfully submit that story?

  • by WilliamGeorge ( 816305 ) on Thursday August 23, 2012 @06:13PM (#41102441)

    The research into frame-rate latencies is really interesting, but the whole idea that *anyone* knowledgeable about PC gaming would have *ever* denied that the CPU was an important factor in performance is ridiculous. I am a consultant at a boutique PC builder (http://www.pugetsystems.com/) and I have always told gamers they want to get a good balance of CPU and GPU performance, and enough RAM to avoid excessive paging during gameplay. Anything outside of that is less important... but to ignore the CPU? Preposterous!

    Then again, it is a Slashdot headline... I probably should expect nothing less (or more)!

  • by towermac ( 752159 ) on Thursday August 23, 2012 @06:34PM (#41102691)

    Hm. First there is:

    "...The FX-4170 supplants a lineup of chips known for their strong value, the Athlon II X4 series. Our legacy representative from that series actually bears the Phenom name, but under the covers, the Phenom II X4 850 employs the same silicon with slightly higher clocks."

    and then:

    "Only the FX-4170 outperforms the CPU it replaces, the Phenom II X4 850, whose lack of L3 cache and modest 3.3GHz clock frequency aren't doing it any favors."

    How can I trust them if they are unaware of basic stuff any chip enthusiast should know? (The Phenom is the Athlon with level 3 cache. The Athlon has none.) They could have also touched on what the two AMD-specific hotfixes were for.

    I'm not shocked at the results, but I am skeptical of the degree of disparity.

  • Re:Err (Score:5, Interesting)

    by Sir_Sri ( 199544 ) on Thursday August 23, 2012 @06:36PM (#41102725)

    If you read the charts, the assertion that 'CPU doesn't matter' is kind of true in a lot of cases.

    It's not that it doesn't matter at all, but the difference between an $1,100 Sandy Bridge i7-3960 and a $200 2500K - even though side-by-side benchmarks show almost a factor-of-two difference in performance (http://www.cpubenchmark.net/high_end_cpus.html) - is less than 10% in games. Now, those processors are still *way* better than the AMD offerings, unfortunately, and the AMD processors are in many cases so bad that the CPU becomes the dominant problem.

    The new "bulldozer" architecture from AMD is a disaster, in just about every way. They're terrible. The charts clearly show that.

    The video card makers (more than the review sites) have correctly pointed out that performance is much more likely to be GPU-gated than CPU-gated - or, as with a problem I'm working on now, gated by a single core because the algorithm doesn't neatly parallelize, so more cores don't help. If you're given a choice between a $1,000 CPU and a $600 one from the same company, odds are you won't be able to tell the difference, so in that sense they're reasonably correct: there's virtually no benefit to buying an extreme CPU or the like if your primary goal is gaming performance. If you're talking about the best use of, say, $1,000 to build a gaming PC, then the cheapest i5 you can find with the best video card you can afford is probably the best bang for your buck.

    As someone above said, an RTS like StarCraft is more likely to be CPU-limited than GPU-limited.

    What this tells us is that AMD processors are terrible for gaming, but there's virtually no difference between the FX processors (don't buy those, though; if you're buying AMD, buy a Phenom), and within the Intel family there is, again, virtually no difference across a factor-of-four-or-five price range.

    What they didn't look at (because you don't really benchmark it) is load times; I think the FX processors have a much faster memory subsystem than their Phenom counterparts if you have a good SSD, but otherwise someone should take a bulldozer to Bulldozer.

    If we were to revisit the oft-used car analogy for computing: it's a fair assertion that which brand of car you buy won't help you get to work any faster day to day; slightly better cars with faster pickup will have a small (but measurable) benefit, but that's about it. Well, unless you buy a Land Rover or a BMW 7 Series (http://www.lovemoney.com/news/cars-computers-and-sport/cars/12461/the-country-that-makes-the-most-reliable-cars, http://www.reliabilityindex.com/ ), at which point you should budget time into your schedule for the vehicle to be in the shop.

  • Re:Err (Score:4, Interesting)

    by Cute Fuzzy Bunny ( 2234232 ) on Thursday August 23, 2012 @07:04PM (#41103099)

    For years, absolutely nobody has maintained that CPUs have little impact on gaming performance; all you need is a god-tier video card setup, and a game engine that magically handles everything via GPU.

    There, I fixed it.

    Seriously, this has to be the most nonsensical Slashdot summary I've read all day. CPU hasn't been a minor factor in gaming for several gaming aeons now, and there is no shortage of games that are critically dependent on it (Hi, Skyrim!).

    Check out your favorite hot-deals website. The mantra is: a Celeron or any old AMD chip made in the last five years plus a solid GPU = goodness. I could point you to dozens of threads where this is the de facto standard.

    But that's what you get when you combine cheap with minimal knowledge. Eventually everyone becomes convinced that it's true.

  • Re:Err (Score:4, Interesting)

    by snemarch ( 1086057 ) on Thursday August 23, 2012 @07:10PM (#41103181)

    PunkBuster spiking to 20-30% CPU is, as you mentioned, a bug - it is not the norm. And while people won't be shutting down every background process to play a game, they don't tend to run anything heavy while gaming. And all the regular stuff (web browser with a zillion tabs loaded, email client, IM client, torrent client, ...) is pretty negligible CPU-wise.

    I personally haven't run into games that can utilize more than two cores (please let me know if they're out there!), and even then there have usually been synchronization issues that keep the game from reaching 100% core utilization, even on the slower cores. Parallelizing stuff is hard, and outside of the core graphics pipeline (which runs mostly on the GPU), there's so much stuff that needs to run in strict order in a game engine. I sure do hope clever programmers will think of improvements in the future, though, since we'll hit the GHz wall sooner or later - and then we need to scale on the number of cores.

    As things are right now, I'd still say a faster dual-core is more bang for the buck than a slower quad-core, game-wise - but that might change before long. And considering that the current crop of CPUs can turbo-boost a couple of cores when the other cores are inactive, it's obviously better to shop for a quad-core than a dual-core - but with the current crop of games, you'd effectively be using the CPU as a faster dual-core when not running intensive background stuff :-)

    You can't really compare the consoles directly to x86 CPUs, by the way; the architecture is radically different - more so on the PlayStation side than the Xbox (and let's ignore the original Xbox here, for obvious reasons :)). I wonder if Sony is going to keep up their "OK, this is pretty wacky compared to the commodity multicore stuff you're used to, but it's really cool!" approach, or if they'll settle for something "saner".
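
The strict-ordering point is essentially Amdahl's law: if some fraction of each frame's work must run serially on one core, extra cores stop paying off quickly. A back-of-the-envelope sketch (the 50% serial fraction is an illustrative assumption, not a measured figure):

```python
# Amdahl's law: speedup from N cores when a fraction of the
# per-frame work is inherently serial.

def speedup(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (2, 4, 8):
    print(f"{cores} cores, 50% serial: {speedup(0.5, cores):.2f}x")
# 2 cores -> 1.33x, 4 cores -> 1.60x, 8 cores -> 1.78x
```

Which is why a faster dual-core can beat a slower quad-core in games whose engines serialize most of the frame.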

  • how to tell (Score:4, Interesting)

    by Andrio ( 2580551 ) on Thursday August 23, 2012 @08:01PM (#41103663)
    In a game, look at the sky. If your framerate shoots up, the video card was your bottleneck. If it doesn't, your CPU is.
  • by Anonymous Coward on Friday August 24, 2012 @01:30AM (#41105853)

    No, not really.

    I assume you are referring to the fact that when you look at the sky, the game engine culls (skips rendering) most of the objects in the scene, so the GPU has less to do, and if you are not CPU-bound the frame rate shoots up. However, when you are not looking at the sky, BOTH the CPU and GPU loads increase, and your test does not reveal which has become the bottleneck.

    Your test only confirms the obvious: that it takes fewer resources (CPU and GPU) to render the sky than a full scene.
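
A more controlled variant of the same idea is to shrink the GPU's workload while holding the CPU's workload constant, for example by dropping the render resolution. A rough sketch; render_frame() and set_resolution() are hypothetical hooks into whatever engine you're profiling, and the 1.2x threshold is an arbitrary cutoff:

```python
# Rough bottleneck probe (illustrative): quartering the pixel count cuts
# GPU work but leaves CPU-side game logic untouched, so comparing frame
# times at two resolutions hints at which side is the bottleneck.
import time

def mean_frame_ms(render_frame, frames=500):
    start = time.perf_counter()
    for _ in range(frames):
        render_frame()  # hypothetical per-frame engine hook
    return (time.perf_counter() - start) * 1000.0 / frames

def probe_bottleneck(render_frame, set_resolution):
    set_resolution(1920, 1080)
    full = mean_frame_ms(render_frame)
    set_resolution(960, 540)  # a quarter of the pixels, same CPU work
    low = mean_frame_ms(render_frame)
    return "likely GPU-bound" if full > low * 1.2 else "likely CPU-bound"
```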
