Frame Latency Spikes Plague Radeon Graphics Cards

Posted by Soulskill
from the have-you-tried-turning-it-off-and-on-again dept.
crookedvulture writes "AMD is bundling a stack of the latest games with graphics cards like its Radeon HD 7950. One might expect the Radeon to perform well in those games, and it does. Sort of. The Radeon posts high FPS numbers, the metric commonly used to measure graphics performance. However, it doesn't feel quite as smooth as the competing Nvidia solution, which actually scores lower on the FPS scale. This comparison of the Radeon HD 7950 and GeForce 660 Ti takes a closer look at individual frame latencies to explain why. Turns out the Radeon suffers from frequent, measurable latency spikes that noticeably disrupt the smoothness of animation without lowering the FPS average substantially. This trait spans multiple games, cards, and operating systems, and it's 'raised some alarms' internally at AMD. Looks like Radeons may have problems with smooth frame delivery in new games despite boasting competitive FPS averages."
  • by earlzdotnet (2788729) on Wednesday December 12, 2012 @05:53PM (#42265931)
    It seems like everyone always wants a single measurement to judge how good something is. Graphics cards have FPS, CPUs have GHz, ISPs have MB/s. What's not shown in these single number measurements are things like lag, or overheating problems, or random spikes of instability.

    Sigh. Maybe one day we'll learn that every product needs more than a single number to judge how good it is performance-wise.
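    [Ed. note: a minimal sketch of the point above, using made-up frame-time numbers rather than any real benchmark data. Two hypothetical cards post essentially the same average FPS, but a percentile of the per-frame latency exposes the spikes that the single number hides.]

```python
# Hypothetical frame times in milliseconds for two cards -- illustrative
# numbers only, not measurements from the article.
smooth = [16.7] * 60                # steady ~60 FPS pacing
spiky  = [10.0] * 54 + [77.0] * 6   # mostly fast, with six big spikes

def avg_fps(frame_times_ms):
    """Average FPS: frames rendered divided by total wall time."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def p99_ms(frame_times_ms):
    """99th-percentile frame time: roughly what the worst hitches look like."""
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

print(round(avg_fps(smooth), 1), round(avg_fps(spiky), 1))  # both ~59.9 FPS
print(p99_ms(smooth), p99_ms(spiky))                        # 16.7 vs 77.0 ms
```

    Same headline number, very different experience: the second card spends 77 ms on its worst frames, which is several dropped refresh intervals on a 60 Hz display.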
  • Re:nVidia (Score:5, Insightful)

    by Anonymous Coward on Wednesday December 12, 2012 @06:05PM (#42266061)

    Any serious gamer goes for the best bang for their buck so they can spend more on games, rather than treating some misguided notion of brand loyalty as a significant indicator of anything.

  • by The Last Gunslinger (827632) on Wednesday December 12, 2012 @06:10PM (#42266115)
    So they just flipped the fraction and multiplied by 1000...brilliant! </sarcasm>
  • Re:nVidia (Score:4, Insightful)

    by poetmatt (793785) on Wednesday December 12, 2012 @06:40PM (#42266463) Journal

    Except the statement that a brand is "consistently producing quality goods" is inaccurate every time, because no brand consistently produces quality goods.

    So the statement you are replying to was correct. That includes HP and Sony.

  • Re:nVidia (Score:5, Insightful)

    by fostware (551290) on Wednesday December 12, 2012 @06:55PM (#42266575) Homepage

    I always bought nVidia until the 7950 / 8800 / 9800 dry solder issues. After that many RMAs and arguments with wholesalers and retailers, I bought AMD out of retaliation, and this 5870 has been rock solid. I'll be going back to nVidia for my next refresh, but for me, nVidia has not 'always just worked'.

    In fact, I'd wager no one brand 'just works' these days, since extreme capitalism is in these days, and they'll shop around for manufacturing plants and methods.

  • Re:nVidia (Score:3, Insightful)

    by Anonymous Coward on Wednesday December 12, 2012 @07:20PM (#42266807)

    In fact, I'd wager no one brand 'just works' these days, since extreme capitalism is in these days, and they'll shop around for manufacturing plants and methods.

    I'd argue that nVidia and AMD aren't brands in the traditional sense.

    The majority of video card problems are caused by card manufacturers, rather than the chipset vendors. I've had nVidia cards that weren't worth the box they came in; I've had my share of Radeon-based cards that were complete crap. Nine times out of ten, though, whining about nVidia or AMD and video card problems is something along the lines of whining about Intel because your computer's Samsung hard drive failed. You should probably be bitching out Dell.

    (Or in my case, XFX. Fool me once, shame on you. Fool me twice, I've started a list of manufacturers to avoid, because I forgot how inept you guys are.)

  • by TheLink (130905) on Thursday December 13, 2012 @05:10AM (#42270443) Journal
    There's a difference between measuring milliseconds per frame and frames per second.
    With the former your minimum resolution is one frame.
    With the latter your minimum resolution is one second.

    Because of that, even the minimum FPS rate doesn't necessarily tell you how jerky or smooth the rendering is, since it's averaged out over one second.

    Taken to the extreme, a card could render 119 frames in 1 millisecond and then get stuck on one frame for 999 milliseconds. It would look like a frigging slide show but still show up as 120 FPS, whereas that sort of thing sticks out like a sore thumb on a milliseconds-per-frame graph. Hence measuring milliseconds per frame is better.
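    [Ed. note: the arithmetic of the extreme case above, checked in a few lines. The numbers are the comment's hypothetical scenario, not a real frame trace.]

```python
# The comment's extreme case: 119 frames rendered in 1 ms total,
# then a single frame that hangs for 999 ms.
frame_times_ms = [1.0 / 119] * 119 + [999.0]

total_s = sum(frame_times_ms) / 1000.0   # ~1 second of wall time
fps = len(frame_times_ms) / total_s      # reported average: 120 FPS

worst_frame = max(frame_times_ms)        # 999 ms -- a visible one-second freeze
print(round(fps), worst_frame)           # 120 999.0
```

    The FPS counter reports a perfectly healthy 120, while the per-frame view shows the animation froze for nearly a full second.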

    The extreme case shouldn't happen in practice, but as the article shows (and from personal experience) the max/high latency stuff does happen. I've personally experienced this on my old ATI and Nvidia cards: my Nvidia 9800GT was slower but smoother than my current Radeon. I went ATI because the Nvidia cards were dying too often (they had a manufacturing/process issue back then), but my next card is probably going to be Nvidia. Even with Guild Wars 1 my ATI card can feel annoyingly "rough" when turning in the game. The FPS stays high, but it's rough on the eyes when you get 60 FPS by rendering a few frames fast, then a very short pause, then a few frames fast, then a pause, repeating ad nauseam.

    On a related note it's good to see that at least some benchmark sites are also starting to take latency/consistency into account for stuff like SSDs. A maximum latency that is too high and occurs too often will result in worse user experience, even if the overall throughput is high, and even for storage/drives.
