
Carmack: Next-Gen Console Games Will Still Aim For 30fps

An anonymous reader sends this excerpt from Develop: "Games developed for the next generation of consoles will still target a performance of 30 frames per second, claims id Software co-founder John Carmack. Taking to Twitter, the industry veteran said he could 'pretty much guarantee' developers would target the standard, rather than aiming for anything as high as 60 fps. id Software games such as Rage and the Call of Duty series both hit up to 60 fps, but many titles in the current generation fall short, such as Battlefield 3, which runs at 30 fps on consoles. 'Unfortunately, I can pretty much guarantee that a lot of next gen games will still target 30 fps,' said Carmack."
  • by stanjo74 ( 922718 ) on Wednesday December 19, 2012 @02:42AM (#42334243)
    Neither DirectX nor OpenGL supports proper triple buffering to avoid tearing at variable frame rates. Because of that, if you want tear-free rendering but cannot keep up at 60 fps all the time, you must render at 30 fps or 15 fps, but not, say, 48 or 56 fps (see the sketch below). You can render at any variable frame rate if you allow tearing, which is what most games do, avoiding the headache of v-sync altogether.
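
    A minimal sketch of that quantization, assuming a fixed 60 Hz display and classic double-buffered v-sync where a finished frame is shown at the next vblank; the function and numbers are illustrative, not any real API:

        #include <cstdio>

        // With double-buffered v-sync on a 60 Hz display, a completed frame
        // can only be presented at a vblank, so a frame that takes longer
        // than one refresh interval waits for the next one: the steady-state
        // rate snaps to 60, 30, 20, 15... fps, never 48 or 56.
        double effectiveFps(double renderTimeMs, double refreshHz = 60.0) {
            const double vblankMs = 1000.0 / refreshHz;
            // Refresh intervals until the first vblank after the frame
            // finishes (always at least one).
            int intervals = static_cast<int>(renderTimeMs / vblankMs) + 1;
            return refreshHz / intervals;
        }

        int main() {
            for (double ms : {10.0, 16.0, 18.0, 25.0, 34.0})
                std::printf("render %5.1f ms -> %4.1f fps\n", ms, effectiveFps(ms));
            // Prints 60, 60, 30, 30, 20 -- nothing between 30 and 60.
        }

    Proper triple buffering would let the GPU keep rendering into a third buffer instead of stalling at the vblank, which is the facility the parent says the PC APIs don't properly expose.
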
  • Re:Detail (Score:5, Informative)

    by Anonymous Coward on Wednesday December 19, 2012 @02:58AM (#42334335)

    ... Considering most people can't perceive frame rates faster than 30 ...

    This myth needs to die [100fps.com].

    Everybody can perceive frame rates faster than 30 fps. In fact, almost everybody can perceive frame rates faster than 100 fps. Check the linked article; this is really a tricky question. Some things to consider:

    - Games have no motion blur or, as many modern games now do, they use a pathetic, fake imitation that looks worse than no motion blur at all. Hence, they need much higher frame rates to show fluid motion. At 60 fps with current technology (including so-called next-gen), motion will look much better than at 30.

    - Decades of cinema have trained most people to perceive low-quality, blurred animation as 'film quality' and smooth, crisp animation as 'fake' or 'TV quality'. Many, many people consider a 48 fps Hobbit worse than a 24 fps one. This is a perception problem. Games could have the same issue, except they've evolved much faster, and most people didn't have time to get used to the bad quality.

    - Consider the resolution problem. Higher resolution demands higher fidelity: you'll want higher-quality textures and shading to reach similar levels of immersion, since details are now much more apparent. The same thing happens with animation and higher frame rates. This doesn't mean we should stay at low resolutions, 16 colors, black & white, or 30 fps. It just means we need to do better.

    - And... a game is not film, and latency matters. A lot. At 30 fps, you wait twice as long to see any feedback from your input (a worked example follows this comment). In most games you just train yourself to input commands in anticipation without knowing a word about latency, but in action games, where your reaction time matters, latency is a problem. And many other sources of latency add to the sum, such as clumsy 'smart' TVs post-processing your images, or badly engineered 'casual' wireless motion controllers.

    In other words, yes, I'd rather have half the detail and 60 FPS. Except if your game is no game at all, but rather a 6-to-10-hour movie. Since most of the top video game chart entries fit that description today, I can see why many developers will remain in the 30 fps camp.
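
    A back-of-the-envelope sketch of the latency point above. The pipeline model and the millisecond figures are assumptions for illustration (input sampled at the start of a frame, one extra frame for scanout, plus whatever the TV adds), not measurements:

        #include <cstdio>

        // Worst-case input-to-photon estimate under the assumed model:
        // one frame to simulate and render, one frame for flip/scanout,
        // plus the display's own processing delay.
        double latencyMs(double fps, double tvProcessingMs) {
            double frameMs = 1000.0 / fps;
            return frameMs + frameMs + tvProcessingMs;
        }

        int main() {
            std::printf("30 fps, 50 ms TV:        %.0f ms\n", latencyMs(30, 50)); // ~117
            std::printf("60 fps, 50 ms TV:        %.0f ms\n", latencyMs(60, 50)); // ~83
            std::printf("60 fps, 10 ms game mode: %.0f ms\n", latencyMs(60, 10)); // ~43
        }

    Halving the frame time shaves roughly 33 ms off the chain before the TV even enters the picture.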

  • Re:Detail (Score:5, Informative)

    by Nyder ( 754090 ) on Wednesday December 19, 2012 @03:00AM (#42334345) Journal

    Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS? Considering most people can't perceive frame rates faster than 30, it makes a bit of sense to push more polygons instead.

    When it comes to games, you can tell the difference between 30 fps and 60 fps. With TV and movies, no, you can't; with video games, yes, you can.

    I should have mentioned the reason why.

    When you shoot video, you capture single pictures. When people are moving in these shots, they have motion blur. How much motion blur depends on how fast they are moving and how many shots per second you take. Our eyes see the motion blur and our mind fills in the rest, which is why we are okay with 24 and 30 fps for movies and video.

    In video games, each frame is crisp: it doesn't have the motion blur that real-life video would have (the toy sketch below illustrates the difference). Granted, games have started adding motion blur, but it's not the same. This is why more frames per second generally make games look better and play better.

    We did cover this in the Hobbit at 48fps submission.
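
    A toy sketch of that difference, with made-up numbers: a dot moving 8 cells per frame across a 32-cell "scanline". A camera with a finite exposure integrates many sub-frame positions into a smear; a game samples one instant, so each frame is sharp but the motion is steppy:

        #include <cstdio>
        #include <cstring>

        int main() {
            const int width = 32, subSamples = 8;
            for (int frame = 0; frame < 3; ++frame) {
                char camera[width + 1], game[width + 1];
                std::memset(camera, '.', width); camera[width] = '\0';
                std::memset(game,   '.', width); game[width]   = '\0';
                int start = frame * 8;
                for (int s = 0; s < subSamples; ++s)  // integrate over the exposure
                    camera[start + s] = '#';
                game[start] = '#';                    // one instantaneous sample
                std::printf("frame %d  camera: %s\n", frame, camera);
                std::printf("         game:   %s\n", game);
            }
        }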

  • Re:Detail (Score:5, Informative)

    by frinsore ( 153020 ) on Wednesday December 19, 2012 @03:19AM (#42334415)

    For a 60 fps game there's about 16 ms per frame, and on current-gen consoles about 8 ms of that is lost to API call overhead on the render thread. Of course, current-gen consoles are years behind and constrain rendering APIs to be called from a single thread, but I'd still be very surprised if there were a console that could support a triple-A game above 70 fps in the next 10 years (for resolutions 720p and above).

    You've barely scratched the surface of input to perception lag, here's an answer by Carmack to people questioning another one of his tweets:
    http://superuser.com/questions/419070/transatlantic-ping-faster-than-sending-a-pixel-to-the-screen [superuser.com]
    Of course, most engines come from a single-threaded game mentality where they'd poll for input, apply input to game state, do some AI, do some animation, calculate physics, then render everything and repeat. Current-gen consoles have freed that up some, but most engines didn't go above 2 or 3 major threads, because it's a difficult problem to re-architect an entire engine while it's being used to make a game at the same time. Sadly, even the better games gave user input its own thread that polled every 15 ms or so, queued events up, and then passed them to the game thread when the game thread asked for them. Input wasn't lost as often, but it didn't get to the game any faster (a sketch of that pattern follows below).
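
    A sketch of that input-thread pattern; the structure and names are mine, not from any shipped engine. The poller timestamps events every ~15 ms so none are dropped, but the game thread still drains the queue only once per ~33 ms frame, so nothing arrives any sooner:

        #include <atomic>
        #include <chrono>
        #include <cstdio>
        #include <mutex>
        #include <thread>
        #include <vector>

        struct Event { std::chrono::steady_clock::time_point when; int button; };

        std::mutex gLock;
        std::vector<Event> gQueue;
        std::atomic<bool> gRunning{true};

        void inputThread() {
            while (gRunning) {
                int button = 0;  // a real poller would read the pad here
                {
                    std::lock_guard<std::mutex> g(gLock);
                    gQueue.push_back({std::chrono::steady_clock::now(), button});
                }
                std::this_thread::sleep_for(std::chrono::milliseconds(15));
            }
        }

        int main() {
            std::thread poller(inputThread);
            for (int frame = 0; frame < 4; ++frame) {  // ~30 fps game loop
                std::this_thread::sleep_for(std::chrono::milliseconds(33));
                std::vector<Event> batch;
                {
                    std::lock_guard<std::mutex> g(gLock);
                    batch.swap(gQueue);  // events handed over once per frame
                }
                std::printf("frame %d consumed %zu event(s)\n", frame, batch.size());
            }
            gRunning = false;
            poller.join();
        }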

  • by tepples ( 727027 ) <tepples.gmail@com> on Wednesday December 19, 2012 @10:57AM (#42336441) Homepage Journal

    but how do you do progressive when the hardware is built for interlaced?

    The vertical sync pulse is delayed by half a scanline before odd fields, according to this diagram [sxlist.com]. Delay it and the analog hardware will begin retrace half a scanline later, which produces an odd field. Don't delay it and the TV interprets it as an even field.

    We're talking analogue TV sets here - they DON'T DO progressive. Period.

    Then how does my analog TV set do progressive when my NES, Genesis, Super NES, original PlayStation, or Nintendo 64 is connected to it? Question mark?
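
    A rough version of the timing involved, from my reading of the linked diagram (numbers are nominal NTSC values, not measured): interlaced video alternates field phase via that half-line vsync shift, averaging 262.5 lines per field, while consoles like the NES never shift, emitting 262 full lines every time so the beam retraces the same place each pass; that's non-interlaced "240p":

        #include <cstdio>

        int main() {
            const double lineUs = 63.556;             // one NTSC scanline, ~63.6 us
            double interlacedField  = 262.5 * lineUs; // phase alternates odd/even
            double progressiveFrame = 262.0 * lineUs; // NES-style "double strike"
            std::printf("interlaced field:  %.1f us (%.2f Hz)\n",
                        interlacedField, 1e6 / interlacedField);
            std::printf("progressive frame: %.1f us (%.2f Hz)\n",
                        progressiveFrame, 1e6 / progressiveFrame);
        }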
