Carmack: Next-Gen Console Games Will Still Aim For 30fps

An anonymous reader sends this excerpt from Develop: "Games developed for the next generation of consoles will still target a performance of 30 frames per second, claims id Software co-founder John Carmack. Taking to Twitter, the industry veteran said he could 'pretty much guarantee' developers would target that standard rather than aiming for anything as high as 60 fps. id Software games such as Rage, as well as the Call of Duty series, hit up to 60 fps, but many titles in the current generation fall short, such as Battlefield 3, which runs at 30 fps on consoles. 'Unfortunately, I can pretty much guarantee that a lot of next gen games will still target 30 fps,' said Carmack."

  • Detail (Score:4, Insightful)

    by Dan East ( 318230 ) on Wednesday December 19, 2012 @02:20AM (#42334123) Journal

    Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS? Considering most people can't perceive frame rates faster than 30, it makes a bit of sense to push more polygons instead.

    • Re:Detail (Score:5, Insightful)

      by Radres ( 776901 ) on Wednesday December 19, 2012 @02:21AM (#42334135)

      I think Carmack's point is that the other studios will push half the content at 30fps because they're lazy.

      • That's not really how it works. Every dev I've ever worked with in the games industry aims for 60fps. Given enough time and resources, that's what they'd all end up delivering. Since they're never given enough time or resources, by either management or their publishers, there comes a time when the only option is to drop to 30fps. It has nothing to do with laziness, and everything to do with money.
    • Re:Detail (Score:5, Insightful)

      by epyT-R ( 613989 ) on Wednesday December 19, 2012 @02:38AM (#42334215)

      Not this again. This assumption is based on perceived motion from frames containing captured motion blur, and even in such (24/30Hz) frames, motion is NOT transparent to most people. With games there is no temporal data in the frames, so it's VERY obvious. Even 60 is not enough for many gamers, which is why they opt for 120Hz panels (real 120Hz, not HDTV '120' interpolation, which looks terrible) and video cards that can push them.

      Then there is input lag. The perceived turnaround time is very noticeable at 30fps, and if the rendering is not decoupled from the input polling/IRQ handling, the latter's latency actually does go up. id had to patch Quake 4 to make it acceptable to play, because the 60Hz mode was dropping inputs and looked choppy as hell compared to previous releases. Enemy Territory: Quake Wars, which is also idTech 4, was locked at 30 and was deemed unplayable by many; I think it was one of the reasons the game tanked. It was actually painful to look at in motion.

      Console devs always push excessive graphics at the expense of gameplay, because the publishers want wow factor over playability. This was true in the 8-bit and 16-bit days too; some games suffered so badly they were deemed unplayable. This is why PC gamers value useful graphics configuration options in their games. Often what the publishers/devs considered 'playable' was not what the community thought was playable, not that this should shock anyone given today's 'quality' releases.

      • Re:Detail (Score:5, Informative)

        by frinsore ( 153020 ) on Wednesday December 19, 2012 @03:19AM (#42334415)

        For a 60fps game there's about 16ms per frame, and on current-gen consoles about 8ms of that is lost to API call overhead on the render thread. Of course, current-gen consoles are years behind and constrain rendering APIs to be called from a single thread, but I'd still be very surprised if there was a console that could support a triple-A game above 70fps in the next 10 years (at resolutions of 720p and above).

        You've barely scratched the surface of input-to-perception lag. Here's an answer by Carmack to people questioning another one of his tweets:
        http://superuser.com/questions/419070/transatlantic-ping-faster-than-sending-a-pixel-to-the-screen [superuser.com]
        Of course, most engines come from a single-threaded mentality: poll for input, apply input to game state, do some AI, do some animation, calculate physics, then render everything and repeat. Current-gen consoles have freed that up some, but most engines didn't go above 2 or 3 major threads, because re-architecting an entire engine while it's being used to make a game is a difficult problem. Sadly, even the better games that gave user input its own thread just polled input every 15ms or so, queued it up, and then passed it on to the game thread when the game thread asked for it. Input wasn't lost as often, but it didn't get to the game any faster.
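
        A minimal sketch of that input-thread pattern (hypothetical names throughout; assumes C++11 threading, with PollHardware() as a stand-in for whatever the platform actually provides):

        #include <atomic>
        #include <chrono>
        #include <mutex>
        #include <thread>
        #include <vector>

        struct InputEvent { int button; bool pressed; };

        // Stand-in for the real platform poll; returns whatever events arrived.
        std::vector<InputEvent> PollHardware() { return {}; }

        class InputQueue {
            std::mutex mtx;
            std::vector<InputEvent> pending;
        public:
            void push(const InputEvent& e) {
                std::lock_guard<std::mutex> lock(mtx);
                pending.push_back(e);
            }
            // Called by the game thread once per tick: drains everything queued
            // since the last call, so no input is lost between ticks.
            std::vector<InputEvent> drain() {
                std::lock_guard<std::mutex> lock(mtx);
                std::vector<InputEvent> batch;
                batch.swap(pending);
                return batch;
            }
        };

        // The ~15ms polling loop described above, on its own thread.
        void inputThread(InputQueue& q, const std::atomic<bool>& running) {
            while (running) {
                for (const InputEvent& e : PollHardware())
                    q.push(e);
                std::this_thread::sleep_for(std::chrono::milliseconds(15));
            }
        }

        int main() {
            InputQueue q;
            std::atomic<bool> running{true};
            std::thread t(inputThread, std::ref(q), std::cref(running));
            std::this_thread::sleep_for(std::chrono::milliseconds(50)); // the "game" runs briefly
            auto batch = q.drain();  // game thread consumes one tick's worth of input
            running = false;
            t.join();
            return static_cast<int>(batch.size()); // empty here, since PollHardware is a stub
        }

        This matches the trade-off in the comment above: the queue keeps events from being dropped between ticks, but the game thread still consumes them only once per tick, so latency is unchanged.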

        • by epyT-R ( 613989 )

          Yeah, I read about that. Some games/drivers/engines are absolutely terrible. I think I was spoiled by the earlier Quakes; of course they had bugs too, but today's games are terrible. I suppose not everything is a competitive shooter, but that doesn't mean it should drop or lag input. It makes the game incredibly frustrating to play.

        • For a 60fps game there's about 16ms per frame, and on current-gen consoles about 8ms of that is lost to API call overhead on the render thread.

          The perceptual limit is around 15ms, so with your numbers that points to an effective 120Hz frame rate as the human-factors baseline for seamless playability.

          I suppose one day we'll look back at sub-120Hz games as having that 'old-fashioned' look.

      • ... how old-style arcade games running on 50/60Hz interlaced CRTs managed to produce smooth, flicker-free motion?

        • by tepples ( 727027 )
          Old-style arcade games and every game console prior to the Dreamcast forced the interlaced CRTs into a non-standard progressive mode called 240p by the retro-gaming community. And though the scrolling on these was at 60 Hz, the actual sprite animation was occasionally as low as 8 Hz because old 2D raster graphics systems didn't support real-time inbetweening [wikipedia.org] of sprite cels.
          • by Viol8 ( 599362 )

            "Old-style arcade games and every game console prior to the Dreamcast forced the interlaced CRTs into a non-standard progressive mode called 240p"

            240 frames progressive? I doubt that; the CRT hardware couldn't have done it. Did you mean 24 frames? Even if you did, CRT TV sets receiving a signal through the RF input would still have been doing 50/60Hz refresh.

            • How on earth do you translate 240p to "240 frames progressive" without making the [effectively] industry-standard terms "480i", "480p", "720p", "1080i", and "1080p" equally meaningless?

              It means 240 scanlines progressive - old NTSC television sets normally like to run at 480i, but they're tolerant enough to handle video signals which don't have the extra half-scanline at the end of each frame and display it non-interlaced.

              • by Viol8 ( 599362 )

                Rubbish. The hardware is built for interlacing; it has no way of knowing that it shouldn't skip a scanline because it's a progressive signal. All you'll see with a progressive signal is the screen flicking between each half of the picture spread across the whole screen, with single-line blank gaps.

             • 240p refers to the vertical resolution, aka 320x240 progressive.

              Easiest way to see it in action is to play a PSone game that does 240p (like PSone Diablo) on a PS2, using component cables connected to an HDTV. Some HDTVs, like mine, have trouble syncing to a 240p signal over component (I have to toggle inputs till it syncs). Play the same game over S-Video and it's fine.
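
              The line arithmetic behind this subthread, as a small sketch (standard NTSC numbers; the exact figures vary slightly by source):

              #include <cstdio>

              int main() {
                  // Standard NTSC: 525 lines per frame, delivered as two interlaced
                  // fields of 262.5 lines each; the half line is what offsets
                  // alternate fields vertically to produce interlace.
                  std::printf("480i: %.1f lines per field\n", 525.0 / 2.0);

                  // The "240p" trick: emit whole 262-line fields instead. With no
                  // half line, every field scans the same positions, giving a
                  // progressive picture (~240 visible lines) at ~59.94 Hz.
                  std::printf("240p: 262 lines per field, ~59.94 progressive frames/sec\n");
              }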

      • by tlhIngan ( 30335 )

        The real reason is that if you want to target 60, you have to aim higher, because if you take just a bit too long, your framerate drops dramatically.

        Target 30, and you can probably render everything in time and have time to spare. But target 60 and miss, and you'll stutter, visibly.

        That's the real issue - it's also why PC gamers go for the fastest video card even though their monitors may only refresh at 60Hz or so - you need to be able to do 60+ fps constantly in order to hit 60 fps solidly. Dip below that and y
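
        The arithmetic behind that headroom argument, sketched under the assumption of plain double-buffered v-sync (a finished frame can only be shown on a refresh boundary, so a missed deadline waits for the next one):

        #include <cmath>
        #include <cstdio>

        // Under double-buffered v-sync, a frame's on-screen time rounds UP
        // to a whole number of refresh intervals.
        double displayedMs(double renderMs, double refreshHz) {
            const double interval = 1000.0 / refreshHz;
            return std::ceil(renderMs / interval) * interval;
        }

        int main() {
            std::printf("16ms frame -> %.1fms on screen (60 fps)\n", displayedMs(16.0, 60.0));
            std::printf("17ms frame -> %.1fms on screen (30 fps)\n", displayedMs(17.0, 60.0));
            std::printf("34ms frame -> %.1fms on screen (20 fps)\n", displayedMs(34.0, 60.0));
        }

        Miss the 16.7ms deadline by even a millisecond and that frame is effectively a 30fps frame, which is exactly the visible stutter being described.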

    • by MrHanky ( 141717 )

      Most people certainly can perceive frame rates faster than 30 FPS. The difference between 30 and 60 FPS when playing a game on a modern LCD display is huge. Stop perpetuating dumb myths.

    • Considering most people can't perceive frame rates faster than 30

      [Citation Needed]

      The difference is very noticeable, but the "problem" is reduced by the enormous input lag present in most console setups. Also, in action-heavy scenes you will notice less that everything is moving less smoothly.
      The difference between 30 fps and 60 fps is less noticeable for camera footage due to motion blur, unless you slow down the playback. Sure, you can make 30 fps games look smoother by applying motion blur, but that only makes the end result blurrier.

    • Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS?

      It depends entirely on the game. In a twitch shooter like Quake, where you expect constant feedback, things feel drastically wrong at 30fps. In a single-player shooter? Those are rarely built for competitive play and don't need quick response; I can handle 30fps if it has decent motion blur, like Crysis. In an RPG? 30fps is mildly annoying but playable.

      But that's only what I can tolerate without shelving the game for a future video card. If I had a choice? I'd pick 60fps over 30fps every time. It's one

    • by Omestes ( 471991 )

      Considering most people can't perceive frame rates faster than 30..

      Can we please stop with this falsehood already? In an FPS, you most assuredly can tell the difference between 30 and 60fps. More frames mean more, smoother motion, which means higher accuracy. 30fps also looks a bit "juttery" with fast motion, especially with digital graphics, since there is no recorded motion blur to cover it up. Also, why all the brouhaha over The Hobbit being at 48fps rather than the standard 24 if no one can notice it?

    • Re:Detail (Score:5, Informative)

      by Anonymous Coward on Wednesday December 19, 2012 @02:58AM (#42334335)

      ... Considering most people can't perceive frame rates faster than 30 ...

      This myth needs to die [100fps.com].

      Everybody can perceive frame rates faster than 30 fps. In fact, almost everybody can perceive frame rates faster than 100. Check the linked article; this is really a tricky question. Some things to consider:

      - Games have no motion blur, or, as many modern games now do, they use a pathetic fake imitation that looks worse than no motion blur at all. Hence they need much higher frame rates to show fluid motion. At 60 fps with current technology (including so-called next-gen), motion will look much better than at 30.

      - Decades of cinema have trained most people to perceive low-quality, blurred animation as 'film quality', and smooth, crisp animation as 'fake' or 'TV quality'. Many, many people consider the 48fps Hobbit worse than the 24fps one. This is a perception problem. Games could have the same issue, except they've evolved much faster and most people didn't have time to get used to the bad quality.

      - Consider the resolution problem. Higher resolution demands higher fidelity: you'll want higher-quality textures and shading to reach similar levels of immersion, since details are now much more apparent. The same thing happens with animation and higher frame rates. This doesn't mean we should stay at low resolutions, 16 colors, black & white, or 30 fps. It just means we need to do better.

      - And... a game is not film, and latency matters. A lot. At 30 fps you wait twice as long to see any feedback from your input. In most games you will just train yourself to input commands in anticipation without knowing a word about latency, but in action games, where your reaction time matters, latency is a problem. And many other sources of latency add to the sum, such as clumsy 'smart' TVs post-processing your images, or badly engineered 'casual' motion Wi-Fi controllers (a rough sum is sketched after this list).

      In other words, yes, I'd rather have half the detail at 60 FPS. Except if your game is no game at all, but a 6-to-10-hour movie; since most of the top video game chart entries fit that description today, I can see why many developers will remain in the 30 fps camp.
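
      A rough version of that latency sum (hypothetical inputToPhotonMs helper; every per-stage number is an illustrative assumption, not a measurement):

      #include <cstdio>

      // Illustrative input-to-photon budget at a given frame rate. The stage
      // costs below are assumptions for the sake of the arithmetic.
      double inputToPhotonMs(double fps) {
          const double frame   = 1000.0 / fps;
          const double poll    = frame * 0.5; // input sampled once per frame: half a frame on average
          const double sim     = frame;       // one tick of game logic before rendering
          const double render  = frame;       // the frame itself
          const double display = 30.0;        // assumed fixed cost for TV post-processing
          return poll + sim + render + display;
      }

      int main() {
          std::printf("60 fps: ~%.0f ms input-to-photon\n", inputToPhotonMs(60.0)); // ~72 ms
          std::printf("30 fps: ~%.0f ms input-to-photon\n", inputToPhotonMs(30.0)); // ~113 ms
      }

      Under these assumptions, halving the frame rate adds far more than the 17ms a single frame suggests, because several pipeline stages scale with frame time.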

      • by dywolf ( 2673597 )

        Yes, that 0.016 seconds of difference between 30 and 60 fps matters. Yup, that's some super-high latency there. It really throws off the shots.
        I mean, there I was, firing the gun, waiting that extra 0.016 of a second to see where the impact landed before firing another shot, repeating this action a few hundred times per second...

        Oh, and a No True Scotsman fallacy too, in the form of a personal opinion that no recent game is a -REAL- game, just a long movie.

        Bravo.

    • Re: (Score:2, Interesting)

      by Gerzel ( 240421 )

      Also depends on what they do with that extra processing power. Are you making a game that is more intuitive? That reacts and learns better? That has AI that is more intelligent and adds to gameplay?

      Really, 30fps is in the range of reasonable quality. You get diminishing returns as you increase fps, especially if the rest of the game doesn't perform to the same standard.

    • by dnaumov ( 453672 )

      Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS? Considering most people can't perceive frame rates faster than 30, it makes a bit of sense to push more polygons instead.

      Stop with this misinformation. Most people definitely CAN perceive framerates faster than 30.

      http://boallen.com/fps-compare.html [boallen.com]

      If you honestly can't tell the difference between 30 and 60 in the above link, you might want to have yourself checked.

    • by Fr33z0r ( 621949 )
      A valid point, with one caveat: everybody can perceive frame rates faster than 30.
    • That's not a choice or *the* choice.

      The reality is, most people would like games to be programmed for actual quality, with the hardware being the limiting factor for 60FPS, rather than developers being lazy and aiming for a low bar. You don't get double the detail at 30 FPS; you get a quarter of the detail, because it's targeted at consoles.

    • by ildon ( 413912 )

      I'm getting really tired of this myth being repeated all the time. Unless your vision is TERRIBLE, you can totally perceive FPS higher than 30. Just because people don't realize why something looks "weird" or "different" doesn't mean they're not perceiving the extra frames. Just look at all the reviews and posts complaining about the high-frame-rate version of The Hobbit. If people couldn't perceive those extra frames, they wouldn't be complaining that it "looked too real" or "like a soap opera."

      The reason

    • What I don't understand is why we would have to settle for a choice of one or the other.

      I'll take BOTH thank you very much.

    • by jon3k ( 691256 )

      Considering most people can't perceive frame rates faster than 30

      Source please

    • Considering most people can't perceive frame rates faster than 30

      Are you kidding? Not only do I notice the difference, the difference is huge. Oh, and remember CRT monitors, with very high refresh rates at lower resolutions? Not even 60fps is "more than enough and well beyond the threshold of noticing a difference". Small dips are noticeable, too.

      Of course, I'm talking about action games that require precision and aim. I am not talking about cutscenes, "mash button to trigger random crap" gameplay, puzzle games or

    • *sigh* Not this "I can't see more than 30 fps" crap again.

      Give users a CHOICE:

      Some want QUALITY
      Some want PERFORMANCE

      Who is right? BOTH !!

      Personally, I prefer 72 to 100 Hz, because in a HUGE multiplayer fight your framerate WILL drop. This "safety margin" (usually) guarantees the framerate will stay above 60 Hz.

      The second reason is that IF the game supports proper 3D, then 30 FPS is not a helluva lot easier for the dev to hit than trying to figure out what details to start dropping to get back UP to 30.

  • by stanjo74 ( 922718 ) on Wednesday December 19, 2012 @02:42AM (#42334243)
    Neither DirectX nor OpenGL supports proper triple buffering to avoid tearing at variable frame rates. Because of that, if you want tear-free rendering but cannot keep up at 60 fps all the time, you must render at 30 fps or 15 fps, but not, say, 48 or 56 fps. You can render at any variable frame rate if you allow tearing (which most games do, avoiding the headache of v-sync altogether).
    • by AmiMoJo ( 196126 ) *

      You don't need triple buffering to avoid variable frame rates; you just need variable levels of detail. Rage does exactly that. As it works through the scene it has a time budget for rendering different things, and it drops detail when it notices that it is behind. It works really well; the main complaint is that sometimes it is a bit too pessimistic and drops the detail level lower than it really needs to.
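
      In the spirit of what's described here (not id's actual code; just the general shape of a budget-driven detail governor, with all names and numbers invented for illustration):

      // Toy version of budget-driven detail scaling: compare last frame's
      // cost to the target budget and step a detail level down or up.
      class DetailGovernor {
          int level = 10;           // 0 = lowest detail, 10 = full detail
          const double budgetMs;    // per-frame time budget
      public:
          explicit DetailGovernor(double targetFps) : budgetMs(1000.0 / targetFps) {}

          int update(double lastFrameMs) {
              if (lastFrameMs > budgetMs && level > 0)
                  --level;          // over budget: shed detail immediately
              else if (lastFrameMs < budgetMs * 0.8 && level < 10)
                  ++level;          // comfortable headroom: restore detail
              return level;
          }
      };

      int main() {
          DetailGovernor gov(30.0);    // target 30 fps -> 33.3ms budget
          int lvl = gov.update(40.0);  // slow frame: detail drops to 9
          lvl = gov.update(20.0);      // fast frame: detail climbs back to 10
          return lvl == 10 ? 0 : 1;
      }

      The 0.8 headroom factor is exactly the kind of tuning knob that decides whether such a system feels "a bit too pessimistic", as described above.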

  • Next Gen? (Score:2, Insightful)

    by mjwx ( 966435 )
    So next year's consoles are going to be inferior to last year's PC? Personally, I think that between PC and mobile, the console is doomed. This will never happen with iDevices, but Android tablets already support HDMI out and input from Bluetooth controllers. All we need is for them to get a bit more powerful (Nvidia is advertising a six-fold power increase between Tegra 2 and Tegra 3) and a method of transferring large games (SD card), and they will become plug-in replacements for consoles.

    As for real cutting edge g
  • by dirtyhippie ( 259852 ) on Wednesday December 19, 2012 @03:35AM (#42334483) Homepage

    Good lord, this entire article is based on one tweet - 107 characters. Surely we could have waited for Carmack to say something more detailed than this??

    • In the beginning was The Word.

      Now some would say Grease Is The Word, others claim The Bird is The Word.
      • Also: for those with excessively large Slashdot IDs:

        Oh! (OH!)
        Yo! Pretty ladies around the world,
        Got a weird thing to show you, so tell all the boys and girls.
        Tell your brother, your sister, and your ma-mma too,
        'Cause we're about to throw down and you'll know just what to do.
        Wave your hands in the air like you don't care.
        Glide by the people as they start to look and stare.
        Do your dance, do your dance, do your dance quick,
        Ma-mma, c'mon baby, tell me what's the word?

        Ah word up!

        Everybody say,
        Wh
    • Also:

      Good lord, this entire article is based on one tweet - 107 characters. Surely we could have waited for Carmack to say something more detailed than this??

      THIS!

  • by Sarusa ( 104047 ) on Wednesday December 19, 2012 @04:02AM (#42334583)

    It's a given that most will target 30fps, since more shinies look better in screenshots and YouTube videos than 60fps does. And most consumers can't tell the difference until you put 60 and 30 fps versions side by side and let them play.

    The leaked/rumored PS4/XNext specs show them as equivalent to or slightly weaker than current mid-to-high-end gaming PCs, and those can't do a locked 60 fps on all the recent shiny games at 1920x1080 with all effects on (except games like CoD multiplayer that specifically target it), so it's unlikely the consoles will. Cheap components are the driver, especially for the PS4.

    But there's no reason a fighting game or FPS can't aim for 60fps on the new generation if it wants to. Use your shaders and effects wisely and it's no problem.

  • News Flash! (Score:2, Insightful)

    by Moof123 ( 1292134 )

    Gameplay is still more important than FPS; see RAGE.

    A good game with low FPS is tragic, but a lame game at even the highest FPS still just sucks.

    • A better observation is that when a good game has low FPS, it's disappointing, but the hardware will catch up, making for a nice legacy that is talked about for years ("Will it play Crysis?").
    • by jandrese ( 485 )
      IMHO, Rage was a fun game, with a lot more driving segments than I expected in a normal FPS. Ammo was pretty scarce, but I always had enough to get by. I'm still not sure why everybody seemed to hate it. Was it the lack of chest-high walls?
  • I couldn't care less about 60 fps unless I'm playing a twitchy FPS or a racing game - both of which I play very, very rarely.

    Uncharted, God of War, Okami HD, Darksiders, Journey, Mass Effect, Enslaved, Pixel Junk Monsters, Heavy Rain, LA Noire, GTA4 / 5, Half Life, Ico and SOTC HD, Portal.

    None of these games NEED 60fps - they all look nice with a consistent 30, and 60 wouldn't hurt, but I'd rather have graphical fidelity than frame rate. ESPECIALLY with the law of diminishing returns kicking in to full effect th
