
Study Shows Gamers At High FPS Have Better Kill-To-Death Ratios In Battle Royale Games (hothardware.com) 149

MojoKid writes: Gaming enthusiasts and pro gamers have long believed that playing on high refresh rate displays with high frame rates offers a competitive edge in fast-action games like PUBG, Fortnite and Apex Legends. The premise is that the faster the display can update the action for you, the more every millisecond saved counts when it comes to tracking targets and reaction times. This sounds logical, but there has never been specific data tabulated to back this theory up and prove it. NVIDIA, however, just took it upon themselves, with the use of their GeForce Experience tool, to compile anonymous data on gamers by hours played per week, panel refresh rate and graphics card type. Though this data obviously speaks only to NVIDIA GPU users, the numbers do speak for themselves.

Generally speaking, the more powerful the GPU and the higher the frame rate, along with a higher panel refresh rate, the higher the kill-to-death ratio (K/D) for the gamers that were profiled. In fact, it really didn't matter how many hours per week were played. Casual gamers and heavy-duty daily players alike could see anywhere from about a 50 to 150 percent increase in K/D ratio for significantly better overall player performance. It should be underscored that it really doesn't matter which GPU is at play; gamers with AMD graphics cards that can push high frame rates at 1080p or similar can see similar K/D gains. The new performance sweet spot, however, seems to be 144Hz/144FPS: the closer your system can get to that, the better off you'll be, and beyond that, the higher the frame rate and refresh rate the better.

  • by Bourdain ( 683477 ) on Thursday March 07, 2019 @01:15PM (#58231876)
    Could it perhaps be that those with more expensive rigs are just more serious gamers who play more and are thus more skilled?

    Further, the headline had me scratching my head for a moment as I wasn't sure if it were people who were high playing an FPS...
    • by Anonymous Coward

      From TFS...

      "In fact, it really didn't matter hour many hours per week were played. Casual gamers and heavy-duty daily players alike could see anywhere from about a 50 to 150 percent increase in K/D ratio for significantly better overall player performance."

      captcha: reading

      • by Calydor ( 739835 ) on Thursday March 07, 2019 @01:35PM (#58232066)

        Did they take players who were used to playing at 30 FPS and give them a 120 FPS rig, did they take players used to playing at 120 FPS and give them a 30 FPS rig, or did they just compare current FPS levels with K/D levels?

        • by gweihir ( 88907 )

          Naaa, that would have been a) scientifically sound and b) very likely not have shown the outcome they need to push their products.
          Cannot have truth in advertising. It is bad for business. And there are enough morons with money to spend who will believe this crap.

          The actual reality is that human visual latency and motor reactions are so slow that above 25 FPS or so, there is no relevant difference in reaction speed in almost all cases. Sure, if you are really, really good, you can maybe get 0.1...1% m

          • Re: (Score:2, Insightful)

            FPS has a huge impact. "Visual latency and motor reactions are so slow that above 25 FPS..." - that is simply wrong. 25 FPS is merely the minimum needed to avoid being distracted by flicker, that is all.

            When the game world is updating at 50 FPS, a screen with a higher frequency is better. Most important, however, is the mouse and how it is synced with the game FPS and screen FPS.

            With a higher screen FPS it is simply much easier to aim correctly!

          • by mlheur ( 212082 )

            I would consider a 100 fps display to be a bit low for a 25 fps receiver.

            Imagine you have a program whose purpose is to display the current time. The clock only needs to update once per second, so you might set a timer to check the system clock once every 0.25 seconds and update the display as appropriate. That would actually be jarring for a human to watch: the seconds indicator would update anywhere from 0.76 to 1.24 seconds apart, and I consider that a huge gap. I would write my program to check at least once e
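
            A quick sketch of that sampling jitter (toy code with made-up numbers: a roughly 4 Hz poll of a 1 Hz clock, with the poll period nudged off 0.25 s so the phase drifts the way real timer jitter would):

              def displayed_tick_times(poll_period=0.26, duration=60.0, phase=0.013):
                  """Poll a 1 Hz clock; record when the displayed second changes."""
                  ticks, last_second, t = [], None, phase
                  while t < duration:
                      second = int(t)              # value the clock face would show
                      if second != last_second:    # display updates on this poll
                          ticks.append(t)
                          last_second = second
                      t += poll_period
                  return ticks

              ticks = displayed_tick_times()
              intervals = [b - a for a, b in zip(ticks, ticks[1:])]
              print(min(intervals), max(intervals))  # ~0.78 to ~1.04 s apart here

            The displayed ticks wander by up to one poll period either side of a true second, which is exactly the jarring effect described above; polling faster shrinks the jitter.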

          • by Bengie ( 1121981 )
            The brain doesn't see in frames, but as a continuous integration. Frames mess with the visual system's ability to motion-track. When playing World of Warcraft with lots of FX going on, if the FPS dips below 50, it gets hard to track players. This negatively affects my ability to properly position myself.

            Back when I played FPS games on my CRT, I had a 75 Hz CRT and my friend had 85 Hz. His video card could maintain a stable 85+ fps. At first I was playing and it felt "smoother". Then he changed ga
        • Exactly, you'd have to do something like that for the results to be meaningful. Data from people playing at home on their own machines will have all kinds of bias.

      • While they only played Battle Royale games for 5 hours a week... that might be because they are playing other, more traditional FPS games such as COD etc. Even additionally playing Warcraft or Minecraft for 30 hours a week will likely give them better hand-eye coordination and reflexes than someone who plays fewer games in general, and someone who plays many different games is more likely to spend more on their gaming system than someone who just plays one game, as they also more likely spend m

      • That only disproves play time as a third factor which increases both. It could be that players who found themselves to be naturally better at playing these games have invested more heavily in gaming rigs.

        In fact, one of their graphs seems to show exactly this. If you look at the zero axis of the kill/death ratio increase vs. hours played graph [hothardware.com] for different GPU owners, you see that the four curves do not converge to zero at zero hours played. Those with higher-end hardware have a better K/D ratio even with m
        • If I only reacted by clicking on the pixels corresponding to the other player's head, I'd never be able to hit anybody half competent at dodging. Aiming in an FPS is much, much more than simply clicking on the pixels corresponding to an enemy's head. There's a continual integration between eye->brain image processing->brain motor planning->hand->mouse, as well as error correction based on proprioceptive feedback from my mouse hand compared to what my eye is seeing.

          If I see another player begin to m
    • by gweihir ( 88907 )

      Correlation is not causation. A favorite beginner's mistake, repeated time and again and often made intentionally to push faulty ideas in politics, marketing and other disciplines primarily focused on creating a false reality via lies. If you have A~B, you can have A->B or B->A or, and that is often the real situation, there is a C with C->A and C->B.

      In this case, it is very likely that more serious shooter gamers (C) have both better kill rates (A) and higher FPS (B). Also, there is factor D
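
      A minimal sketch of that confounder pattern (all numbers invented for illustration): a hidden "seriousness" factor C drives both K/D (A) and frame rate (B), and the two come out strongly correlated even though neither causes the other.

        import random

        random.seed(1)
        seriousness = [random.random() for _ in range(10_000)]               # C
        kd  = [0.5 + 2.0 * c + random.gauss(0, 0.3) for c in seriousness]    # A <- C
        fps = [60 + 120 * c + random.gauss(0, 15) for c in seriousness]      # B <- C

        def pearson(xs, ys):
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            var = lambda vs, m: sum((v - m) ** 2 for v in vs)
            return cov / (var(xs, mx) * var(ys, my)) ** 0.5

        print(pearson(kd, fps))  # ~0.8, although FPS never influenced K/D here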

      • You're not wrong about any of those points; however, I noticed over a decade ago that I performed noticeably better with some mice than others. I later learned it had to do with the polling rate of the mouse.
        120 Hz may sound awfully high to matter, and I salute your skepticism, but that's 8 ms of latency, which I assure you is a lifetime when you're doing precision hand-eye coordination that is largely autonomous.
        These results don't surprise me in the slightest. I have seen this myself at a hundred LAN parties
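
        For reference, the back-of-the-envelope arithmetic behind that 8 ms figure (a sketch; the half-period average assumes events land uniformly at random relative to the update clock):

          # A device or display updating at R Hz adds up to one full period
          # (1000/R ms) of delay before a change is sampled or shown; on
          # average, about half a period.
          for hz in (60, 120, 125, 144, 240, 1000):
              period_ms = 1000.0 / hz
              print(f"{hz:>4} Hz: period {period_ms:6.2f} ms, "
                    f"average added delay {period_ms / 2:5.2f} ms")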
    • by Anonymous Coward

      Besides the obvious problem with correlation vs. causality, there's also the question of whether it's actually the screen refresh rate or something else that is causing the better performance. For example, every semi-regular Quake player (anything up to Q3, at least) would know that you generally want these games running at a "magic framerate" such as 125, 250 or 333 fps, even if your actual display device is still running at 60 Hz, because you get very obvious (measurable) benefits from having your physics
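
      A sketch of why those particular rates were "magic" (this models the commonly cited explanation that Quake 3 stores each frame's duration in whole milliseconds; a simplified illustration, not actual engine code):

        # Physics only matches real time when 1000/fps is (nearly) an integer;
        # rates that truncate favorably change movement and jump heights.
        for fps in (60, 76, 125, 250, 333):
            exact_ms = 1000.0 / fps
            engine_ms = int(exact_ms)   # whole-millisecond frame time, as in Q3
            drift_pct = (engine_ms - exact_ms) / exact_ms * 100
            print(f"{fps:>3} fps: real {exact_ms:6.3f} ms, engine {engine_ms:2d} ms, "
                  f"timing error {drift_pct:+.2f}%")

      At 125 and 250 fps the step is exact; at 333 fps the truncation to 3 ms makes the engine integrate slightly fast, which is where the well-known jump-height quirk comes from.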

    • by AmiMoJo ( 196126 )

      Perhaps, although it would be fairly easy to compare like-for-like, e.g. users with a certain GPU with or without a 144Hz monitor.

      I'm skeptical, though, because there is great variance between 144Hz monitors. The Amazon-special ones have so much ghosting that the frame rate is unlikely to help much. There is no real way for Nvidia to measure monitor response times and control for them.

    • My thought exactly. As usual, it's probably a confluence of both factors.

      Certainly, a smoother, prettier experience is going to make the game more enjoyable, and more "fun" to play for the quantities of time it takes to get very good... but realistically, the top-end graphics cards are $600+ on top of the price of a computer worth running them in (a great video card alone just means you're going to be bottlenecked elsewhere), probably $1200-$1500 base.

      The only people dropping $2k on a desktop today a

    • ...at games, confirming decades of player biases - says NVIDIA's shill.

      That highly informative advertisement even ends with an Amazon affiliate link, suggesting you should buy a $500 graphics card,
      earning hothardware.com up to 10% on all purchases made.

      It's like an advertisement inside an advertisement.
      So you can make money for the fake journalists while they make money off you. WHAT A VALUE!!!

    • by Ranbot ( 2648297 )

      Could it perhaps be that those with more expensive rigs are just more serious gamers who play more and are thus more skilled?

      Don't stop there...
      Could those with more expensive video cards also have researched or paid for an ISP with less latency?
      Could those with more expensive video cards also have monitors with a faster refresh rate and better image quality?
      Could those with more expensive video cards also have high quality surround sound*?
      Could those with more expensive video cards also have better gaming-oriented peripherals (e.g. gaming mice, keyboards, etc.)?

      Those are factors with more impact on K/D stats than whether your vid

      • Could those with more expensive video cards also have researched or paid for an ISP with less latency?

        Even assuming you could get the latency numbers, good luck having any choice in the matter of what ISP you use.

        Could those with more expensive video cards also have high quality surround sound*?

        A pair of $10 earbuds will do you better than an expensive surround sound setup. No crosstalk between ears means you might as well have a radar on your screen pointing at sounds.

        I think that the reality is that once it's past a "reasonable" amount, having a smoother framerate is more important. If a video card is pushing 120fps to your 60hz monitor, you're not going to notice when a gunfight su

        • by Ranbot ( 2648297 )

          Could those with more expensive video cards also have researched or paid for an ISP with less latency?

          Even assuming you could get the latency numbers, good luck having any choice in the matter of what ISP you use.

          Could those with more expensive video cards also have high quality surround sound*?

          A pair of $10 earbuds will do you better than an expensive surround sound setup. No crosstalk between ears means you might as well have a radar on your screen pointing at sounds.

          Re: ISP... I have two ISP options where I live (suburbia). I understand that others may not have options.

          Re: sound... Agreed, headphones > speaker system. I should have said good surround-sound headphones, which can be costly but definitely give an advantage... "a radar on your screen" [as you put it]... those with an expensive video card would probably also invest in their sound.

    • by Trimaz ( 4609805 )
      Higher refresh rate = Lower latency/input lag.
    • This was my first thought. Now GeForce could do that experiment with me if they'd just send me a high end gaming rig for research purposes.

    • by kqs ( 1038910 )

      A company that wants you to spend lots of money on their video equipment has done a study which proves that you will be better at games if you spend lots of money on their video equipment. Also, you will be stronger, more virile, and better smelling. It's science, so you know it's true.

  • I truly hope they win the Ig Nobel

  • Umm, maybe that's because you can't kill what you can't see? Realistically, though, there's of course an upper limit, because at some rate the screen is changing faster than it is humanly possible to see.

    • by gweihir ( 88907 )

      That speed is at 24 FPS for most of the population. That is why that is the frame rate used by quality movies. You know, the classical stuff on celluloid.

      • by Anonymous Coward

        Those same movies open and close the shutter three times per frame so that the flicker is difficult to see. This shows that 24Hz and 48Hz are completely inadequate or they would have used one of those instead.

        The reason movie frame rates were so low is that film was expensive. The reason movie frame rates are so low today is a combination of nostalgia, ignorance, and the fact that the low frame rate is useful for masking bad cinematography, choreography, and acting.

      • Re:Common Sense (Score:5, Insightful)

        by DamnOregonian ( 963763 ) on Thursday March 07, 2019 @02:33PM (#58232484)
        If you're telling me with a straight face that 24 FPS is smooth, I'm forced to conclude one of a few things:
        either you have simply never seen a higher-framerate newscast,
        your brain somehow can't pick up the difference,
        or you're just fucking lying.

        Given that the 24 FPS of celluloid wasn't in any way "the most a person can notice" but a tradeoff between the cost of celluloid and "smooth enough", I'm going with the latter. You act like you've got skin in the game... You one of those nVidia haters?
        • by Anonymous Coward

          There's a lot of misinformation in this thread. Everyone needs to go read about Flicker_fusion_threshold [wikipedia.org]:

          The maximal fusion frequency for rod-mediated vision reaches a plateau at about 15 Hz, whereas cones reach a plateau, observable only at very high illumination intensities, of about 60 Hz.[3][4]

          and

          If the frame rate falls below the flicker fusion threshold for the given viewing conditions, flicker will be apparent to the observer, and movements of objects on the film will appear jerky. For the purposes of presenting moving images, the human flicker fusion threshold is usually taken between 60 and 90 hertz (Hz), though in certain cases it can be higher by an order of magnitude.[7] In practice, movies are recorded at 24 frames per second and displayed by repeating each frame two or three times for a flicker of 48 or 72 Hz.

          p.s. There's also Chronostasis [wikipedia.org], where moving your eyes can cause you to see a frozen image for up to half a second (example [wikipedia.org]).

          • Yes, indeed. Go read about flicker fusion thresholds. Only this time, go past the summary.

            CRT displays usually by default operated at a vertical scan rate of 60 Hz, which often resulted in noticeable flicker. Many systems allowed increasing the rate to higher values such as 72, 75 or 100 Hz to avoid this problem

            For the purposes of presenting moving images, the human flicker fusion threshold is usually taken between 60 and 90 hertz (Hz), though in certain cases it can be higher by an order of magnitude.

            Next.

      • Comment removed based on user account deletion
      • That speed is at 24 FPS for most of the population.

        You clearly have no idea. In movies at 24 FPS, a single frame is usually an exposure with sufficient motion blur that all information between the current frame and the next is captured. It looks smoother thanks to the blurriness, but even that has a limit: typical bright scenes, where shorter exposures are necessary, show jerky movements, especially during camera panning, which is precisely why high-framerate movies are a thing.

        That is why that is the frame rate used by quality movies.

        No. No it is not.

  • by uvajed_ekil ( 914487 ) on Thursday March 07, 2019 @01:18PM (#58231906)
    Perhaps players who invest more in hardware (higher FPS players) are more dedicated than average (lower FPS) players, and their better stats are explained by their time spent, experience, and skill. This is obvious, and I bet NVIDIA didn't control for it in any statistically scientific way.

    Is anyone actually surprised that a gaming hardware manufacturer says that better/newer/more expensive gaming hardware makes gamers better at games? You know they cherry-picked stats in an obvious way, right? This is marketing, not news.
    • by Durrik ( 80651 )
      Could it also be that the data is from Nvidia, gathered anonymously from their GeForce Experience, and is completely false?

      It's strange that a company that makes money selling high end video cards gets a result saying 'buy more high end video cards if you want to win'.

      I can't trust data provided by such a vested party, concluding something that has a high monetary reward for them.
      • It's strange that a company that makes money selling high end video cards gets a result saying 'buy more high end video cards if you want to win'.

        The same way it's strange for a manufacturer of gas masks to warn you that breathing mustard gas will kill you.
        It's self-serving, not strange.
        Self-serving does not mean false, though, and this set of data isn't a surprise to anyone who has been gaming as hardware has evolved. Particularly gamers who have upgraded their machines and noticed instantly that they simply aim better the more fluid it is. It's almost like the brain can tell the difference between 8 ms between frames, and 16 ms between frames, or 32

      • No kidding, this is quite obviously a marketing stunt. NVIDIA is busy, busy trying to prop up their slumping sales now that all the coin miners have stopped snapping up their kit.
  • The premise is that the faster the display can update the action for you

    Fuck's sake; this isn't a premise; it's logic: a process the poster is clearly unfamiliar with.

    Sure wish the site could afford editors with IQs above a hundred...

  • Or you can reduce your screen resolution to 320x240 to increase your frames per second (FPS) and get a faster response in first-person shooter (FPS) multiplayer games. Worked great in Quake back in the day!
    • by Cederic ( 9623 )

      I found high pixel count but low graphics detail worked best in Unreal Tournament. Greater precision for your shooting but still stupidly fast screen refreshes.

      Of course, that was using a CRT monitor, and having a 1Mbps cable connection may have helped a little against all the people on dial-up. Maybe.

    • Nah, just up your FOV to 360 and your resolution down to 1x1, guaranteed hit!
  • Just different than the one we know.
  • This reminds me of when I used to play Netrek.

    I was on a 5 Mb/s cable modem, destroying pretty much everyone playing on a 56K (or slower) dial-up modem...

  • by Anonymous Coward

    The human visual system is adaptive and slows down at low contrast.
    Turning up brightness and contrast can probably save another couple of ms.

  • by EllisDees ( 268037 ) on Thursday March 07, 2019 @01:26PM (#58231996)

    Video card manufacturer produces study that says more expensive video cards are better.

    • Manufacturer of bigger sausages produces study showing that people who eat bigger sausages are less likely to be starving.

      Sure it's a self-serving set of data, but that doesn't mean it's fucking wrong. Use your logic.
      • Manufacturer of bigger sausages produces study showing that people who eat bigger sausages are less likely to be starving.

        I don't get it. Could you use a car analogy?

  • Who would have thought! A study by NVIDIA finds that you need to buy their high end stuff because it'll make you play better.

    Color me shocked.

  • by Pascoea ( 968200 ) on Thursday March 07, 2019 @01:28PM (#58232016)
    I couldn't possibly see a conflict of interest in a company-funded study concluding that their higher-priced items will give you a significant competitive edge in a competition.
  • Generally, weight is weight. It's determination and dedication that overall decide if you get better.

  • Lots of people don't use GeForce Experience, so there may be a big gap of people who are not reporting.

  • by SuperKendall ( 25149 ) on Thursday March 07, 2019 @01:40PM (#58232104)

    That means my absolute suck at Fortnite must be due to framerate, and not related to skill whatsoever!

  • What you really need is fast FPS during turns (and low latency). High FPS running in a straight line is misleading for games. Turn stutter is BS fraud.

  • Take all the high-FPS/high-kill-ratio players and give them identical rigs with identical monitors, with half the subject population's frame rate capped at 60 fps and the other half maxed at 144, using identical 2080 Ti SLI cards. Measure kill ratios. Switch populations. Measure again. Otherwise this sounds like a marketing stunt by Acer to sell some high-refresh-rate monitors. In other news: young adult athletic males wearing Nike are much more likely than non-Nike wearers to be tall NBA players.
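
    A sketch of why that within-subject crossover works (all data invented; each player is measured under both caps, so individual skill cancels in the paired difference):

      import random

      random.seed(7)
      players = [random.gauss(1.0, 0.3) for _ in range(200)]  # baseline skill K/D
      TRUE_EFFECT = 0.15  # assumed K/D boost from the uncapped 144 fps condition

      def measure(skill, capped_60):
          noise = random.gauss(0, 0.2)  # match-to-match variance
          return skill + (0.0 if capped_60 else TRUE_EFFECT) + noise

      # Each player plays both conditions; skill drops out of the difference.
      diffs = [measure(s, capped_60=False) - measure(s, capped_60=True) for s in players]
      print(sum(diffs) / len(diffs))  # ~0.15, recovering the assumed effect

    Observational data like NVIDIA's can't do this subtraction, which is exactly the objection raised throughout this thread.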
  • Remember being called an LPB back in the day?

  • Study Funded and Performed by Major Graphics Card Manufacturer Finds You Should Buy A New More-Expensive Graphics Card.

    Yeah, that's definitely shocking.

  • First I want to point out that I think this "study" is just a marketing scam. NVIDIA is trying to drum up hardware sales now that all the coin miners have gone away.

    If it's actually a legitimate study... why aren't all the privacy nuts up in arms? Where's their testing methodology? How is it possible that NVIDIA could come up with such data? How were they able to gather K/D ratios from games so that they could tie specific gamer accounts to the hardware being tested and prove the correlation?

  • Makes sense that people playing games relying on twitch reflexes would see an improvement. I doubt it helps much in games where coordinated team effort and tactics are required.

    I got high-end cards last year and don't notice much difference because my favorite game is tactical in nature. MUCH prettier picture, though, which was what I was after.
  • than the average amateur players.

    Well, duh. People are willing to spend more on equipment for things they like to do and are good at.

  • An aspect that this report hasn't mentioned, perhaps as it's an unfortunate and unfair issue in some games, is that sadly there can also be code issues at play. Many modern games have suffered from damage dealt or game physics not being frame-rate independent, which can directly equate to player advantages. Quake 3, for example, was well known for having the fastest movement and most useful jumping speed and height at 125 FPS; the issue was fixed many years ago in some mods and subsequently Quake Live

  • These statistics are really lousy and do not say anything about whether an expensive card will really increase player performance.

    1. The "low" frame rate is 120 fps, which would not by any means be considered "low" by most gamers.

    2. They only assess GPUs and do not include displays, other hardware, or network lag. They have no idea which frame rates are actually experienced by the players. There is even a graph showing that the more expensive cards are helpful even on 60 fps screens, despite that all st

  • You young whippersnappers, before most of you were born, the online shooter world was divided into LPBs and HPWs.

