id Says 60fps Is Enough For Doom III

Dot.Com.CEO writes "IGN PC reports that the final version of Doom III will be capped at 60fps, quoting John Carmack as saying 'A fixed tic rate removes issues like Quake 3 had, where some jumps could only be made at certain framerates'. Will this put a stop to fans arguing whether there is a tangible benefit for frame rates over 60fps? What do Slashdot Games readers think about id's decision?" Elsewhere, there's a new preview of Doom III at C+VG, including a mini-interview with Carmack in which he comments: "Now's where it goes from being an interesting demonstration of all the technologies to being a fabulous game, and that really does all happen at the end."
This discussion has been archived. No new comments can be posted.

  • ...if this will make my old g400 play Doom III "better" than Quake III. I guess it all depends on whether or not this relic of the '90s will still be able to play it. I hope so; g400s are still really nice (albeit low-end) cards.
    • if the engine was capped at 2fps...
      • A g400 plays quake3 *fine*

        I dunno how much more doom3 is going to need in terms of power.

        • I dunno how much more doom3 is going to need in terms of power.

          A lot more. The last I read, it was going to run at about 30fps on a GeForce3.
        • A little over a year ago Carmack stated that the Parhelia would run Doom 3, just not nearly as well as the then-current nVidia and ATI parts.

          If I remember correctly, the Parhelia is a significantly newer card than the g400, just as Doom 3 is a significantly newer engine than Quake 3 (although they've been working on the engine for 3 years).

          Cards that were putting out 100+ fps in Quake 3 quite easily are being crippled by Doom 3, if they work at all, so I'd say good luck, but I don't think there's a chance
          • Thanks for the insight... I guess I'll have to ditch this thing after all. I'll probably get a low-9000-series ATI, so it plays nicely with Linux.
            • Re:It would... (Score:4, Informative)

              by Dot.Com.CEO ( 624226 ) * on Thursday October 23, 2003 @03:04PM (#7294309)
              Just a tip. The 9200 series are actually WORSE than the old 8500. They are also DirectX 8 series cards, not 9. Spend a bit more on a Radeon 9600 (non-pro); it'll last longer. Otherwise, a GeForce4 4200 would be an excellent choice instead of the 9200. AVOID the low 9000s Radeons.
  • One downside... (Score:2, Offtopic)

    by Dutchmaan ( 442553 )
    This will definitely eliminate or hinder using Doom3 as a benchmarking tool.
    • by fredrikj ( 629833 ) on Thursday October 23, 2003 @11:03AM (#7291373) Homepage
      No, it won't. We'll just have to get accustomed to the term "number of simultaneous Doom 3 windows" rather than "number of hundreds of frames per second".
    • Re:One downside... (Score:3, Insightful)

      by Directrix1 ( 157787 )
      If you still measure the caliber of a video card in how many quadrillions of frames per second you can get, then you are messed up. You should be looking at the feature set, and seeing if it still maintains the minimum 60fps.
      • When I buy a video card, I want to know that it'll run game X at a good fps AND I want to know that it'll have a fighting chance at being able to run future game Z at an acceptable fps.

        One of the best ways to judge that is by seeing that it has excessive capacity for running current game X.

        My current card is a GF1 DDR, and I will upgrade when I need to (probably when HL2 or DoomIII come out), and I don't want to have to buy another card when Quake4 or whatever other games come out next year.
    • Just because gameplay is locked at 60, that doesn't mean you can't run a timedemo at higher speeds.

      Hopefully that's the case; it would be a shame for id to lose their standing as the default benchmark for x86 PCs.

  • Bad Idea (Score:2, Interesting)

    by xagon7 ( 530399 )
    I can HONESTLY say I can notice framerate differences up to 100fps. I am not kidding. We have been playing games like this for so long, it is only natural for people to adapt and get "used" to higher framerates.

    id's games have been used for years as a benchmark, as a basis for purchasing better hardware, and as a reason to keep supporting their games. They can throw that out the window now.

    I think if they MUST cap the framerate, then at least try to put it around 85. I understand the reasoning, I just wish th
    • I'm not trying to start some giant debate about the highest frame rate the human eye can perceive, BUT I did read this bit about the highest frame rate the human eye can perceive and whether there is a limit:

      "Technically, yes. Practically, no. The technical "limit" would be anything ranging from the speed of light to how long it takes it to reach your eyes to how fast your brain can interpret it. Other things like internal chemical reactions and the state of developed neural connections should also be considered. B

      • Re:Bad Idea (Score:5, Insightful)

        by PainKilleR-CE ( 597083 ) on Thursday October 23, 2003 @11:44AM (#7291816)
        The problem really has nothing to do with how many frames the human eye can see in a second. As was almost implied by the previous reply, it has a lot more to do with your monitor's refresh rate at a given resolution. If your monitor is at 60hz, you're not getting more than 60fps whether your card puts it out or not, because the monitor isn't going to draw them (this is why v-sync is good for games, but is disabled for benchmarks, where the monitor shouldn't have any influence on the test).

        Obviously, capping the framerate was the easiest way for id to solve this particular problem, and 60 fps is generally accepted as good enough. Anyone that thinks it isn't probably hasn't played many games capped at a certain framerate (for instance, you can cap your framerate in Half-Life and many other games). Once it's capped at a certain rate, it limits the possibility for severe slow-downs when framerates drop on complicated scenes. This is the real reason that having a card that plays 200 fps on the latest game (besides the obvious issues with previous Quake games) is important, because the higher average values mean higher values for the lowest framerate in a round. If you're playing with an average of 100 fps and the game slows to 60 fps, it's going to feel like it's crawling, but if you're playing at 60 fps and it drops to 50 fps you might have trouble even noticing it. If it never drops, even better ;)

        In the end, at least they've done something to address a problem typical with their past engines. I wish they had found a more elegant solution that allowed individuals to choose higher framerates without affecting the gameplay, but something's better than nothing. The only people that will complain are the ones that spend most of their time staring at fps counters while they play or benchmarking their graphics cards with the latest demos. Maybe some people will finally figure out that their games would be more playable if they capped their framerates at a reasonable level rather than trying to buy faster hardware and tweaking their systems all day to achieve a 100fps average that gets slammed to 30 fps every time 5 people are on the screen at once.
      • Re:Bad Idea (Score:2, Informative)

        by cybergrue ( 696844 )
        The 60fps is the average maximum that most of us can see. Considering that this is twice what movies are shown at (actually it's a bit more complicated; some theaters project each frame twice to reduce flicker) and just under the refresh rate for most monitors, I don't think this is a problem. The only time I have heard this being a problem in computer games was with quick turns and other high-speed events.

        I was wondering when some game maker was going to decide to cap the fps rate and concentrate on othe

        • "The 60fps is the average maximum that most of us can see. Considering that this is twice what movies are shown at (actually it's a bit more complicated; some theaters project each frame twice to reduce flicker) and just under the refresh rate for most monitors, I don't think this is a problem."

          There's an important difference between frames on film and frames on a game. Frames on film contain an average of an object's movement over the time frame; frames on a game only have the object at that point. In En
          • 3dfx realized exactly what you're saying; thirty frames per second is fine, other than the fact that graphics are freeze frames, not film frames. Hence, they tried to implement their T-buffer concept, to do motion blur and so on.

            Then, the card improvement would be in features and power, not raw frame rate increase.

            • 3dfx was not exactly making waves with their cards' features at the time. With the Voodoo5, other than the T-buffer, they had just caught up to the feature set where Nvidia's TNT2 had been two generations earlier. Transformation and Lighting features were being implemented by everyone except for them, and they were left behind.

              Thus, the story ended for 3dfx.
              • They just had a different thrust. 3dfx went for film-like quality of motion without requiring massive amounts of horsepower, everybody else went for pretty lighting without requiring massive amounts of horsepower.

                Me, I don't know off hand which I'd rather go for. Possibly the first one; like antialiasing, it can make something look much better without doing very much.

          • Actually film's 24fps is pretty jerky. Especially when a scene is being panned. You can see the individual frames alternate.

            Go to the cinema and judge the graphics as you would for a 3D computer game. Sure the resolution is good etc but 24fps really is crappy when compared to 85 fps.

              As for 60fps being the max: it's not that simple. Eyes have variable sensitivity and it's not the same over the entire retina either. Most ppl can notice the flicker of a fluorescent lamp and that's like 100-120Hz (depending o
              • I don't notice fluorescent flicker when the tube is new, but I do when it gets old; I suspect it has to do with something other than just the mains frequency.

              I do notice 60Hz monitor refresh rates and that's unbearable. 75Hz is Okay and 85Hz is perfect (for me). I think the cap at 60Hz might be a bit low.

              How long before people work out how to uncap the refresh rate?
                • I'm lucky enough to not normally see fluorescent lights, but I do get headaches from them.

                  monitors otoh... I can't even stand 85 Hz....

                  and I'm stupidly getting into the IT business - with SAFE MODE! (it's not the color depth that gets to me, but I want to puke if I look at a 60 Hz screen for more than 2 minutes)
        • The 60fps is the average maximum that most of us can see. ...
          The only time I have heard this being a problem in computer games was with quick turns and other high speed events.


          This just isn't true (the part about movies and refresh rates is close enough and mostly true). 60fps is simply a point at which most people see most actions as fairly smooth, when the framerate is consistent. The same is true of 30fps, but very few people would be happy to see the game capped there. Putting it close to the refresh
    • I seem to remember that even in the old Doom days, if you looked at the raw realtics and gametics you could come up with numbers higher than 35fps, even though the original engine was capped there. It still supplied some sort of benchmark data above the cap rate.

      Besides, when Doom frame rates started going up too far they simply used more stressful levels for better benchmarks.

      I've no doubt that some sort of many-monsters, many-colored-light-sources, many-other-things-too levels can be built that will str
    • What refresh rate is your monitor running at? Some monitors can achieve 100hz, but very few of them. If you run at an FPS higher than the monitor's refresh rate, it doesn't display the extra frames, it just displays tearing artifacts. [gstreamer.net]
      • Some monitors can achieve 100hz, but very few of them. If you run at an FPS higher than the monitor's refresh rate, it doesn't display the extra frames, it just displays tearing artifacts.

        It can also happen if you're running at lower fps than the monitor's refresh rate. This is why someone invented v-synch. As long as v-synch's enabled, tearing should be a thing you never see, regardless of your framerate.

        Of course, since v-synch limits your framerate to your refresh rate (since it waits for the vertical
        • Re:Bad Idea (Score:3, Informative)

          by Sevn ( 12012 )
          True,

          But with quake3 at least, you want to be running at least 125fps with vsync turned off so you can nail your strafe jumps. There are some jumps and other movement oriented things you simply can't do as well unless the game is running at 125fps or higher. I'll take some tearing for some frags.
      • Samsung SynchMaster 955df.. 1024x768 @ 100Hz
    • As Carmack said: "The game tic simulation, including player movement, runs at 60hz, so if it rendered any faster, it would just be rendering identical frames."

      It seems with DOOM, you can't get any benefit from higher fps since the inbetween frames are identical. Maybe what you notice has more to do with the difference between the refresh rate of your monitor and the fps of the game?

      I think where this will suck the most is for those of us who are happy with 40fps but don't have the hardware to run Doo

      • Re:Bad Idea (Score:5, Insightful)

        by Jagasian ( 129329 ) on Thursday October 23, 2003 @01:04PM (#7292804)
        This still isn't true. Even if the game world runs at 60hz, your mouse runs at, most likely, 125hz. In all of the Quakes, and the same probably applies to Doom 3, mouse looking is a local thing unrelated to the rate the world runs, but definitely related to your FPS.

        In all of the Quakes, if you have 60fps, then you have an effective 60hz mouse sample rate. Mouse looking at 60hz is far different from mouse looking at 125hz. Hence a framerate cap will decrease the smoothness of mouse looking.

        Really bad to hear they are doing this.
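        To put rough numbers on that coupling, here is a throwaway sketch of my own (the 125Hz figure is the commonly quoted mouse polling rate, not something id has stated about Doom 3); it assumes look input is applied once per rendered frame:

          #include <cstdio>
          #include <initializer_list>

          int main() {
              const double mouseHz = 125.0;   // assumed device report rate
              for (double fps : {30.0, 60.0, 125.0}) {
                  // If look input is only applied once per rendered frame, several
                  // device reports get folded into a single view update.
                  std::printf("%5.0f fps: view updates every %.1f ms, ~%.1f mouse reports per update\n",
                              fps, 1000.0 / fps, mouseHz / fps);
              }
          }
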
    • "id's games have been used for years as a benchmark, as a basis for purchasing better hardware, and as a reason to keep supporting their games. They can throw that out the window now."

      I would imagine they would include a -bench type switch that will throw the game into a non-playable benchmark mode where the cap will be lifted, just to run benchmarks on.
    • by Ayanami Rei ( 621112 ) * <rayanami@nOSPaM.gmail.com> on Thursday October 23, 2003 @09:07PM (#7296995) Journal
      Here's the thing: The "refresh rate" of the human eye is about 15Hz. It's not really fair to call it a refresh rate since different parts of the eye transmit light level changes at different rates (faster in your peripheral vision). It's just that on average, the cones and rods in your eyes take about 60ms to "settle".
      Of course, the eye isn't taking snapshots, over this 60ms you're sort of summing the incoming light over the entire period, and transmitting the "average".

      Thus, we see everything with motion blur pre-attached. Our brains and optical centers are wired to use that blurring to reconstruct the missing motion. Also used in the reconstruction: the fact that different parts of your eye can update independently, so you have this distributed stream of information coming in whenever, and your brain is assimilating all of it and using all of it (including differences in timing) to get the best representation possible.

      Okay, so then clearly, we need really high framerates because our eyes are not cameras telesynced to the monitor... we need to have the pixels as accurate as possible at every time because we never know when the cells in a region will want to transmit. We "figure out" that we are seeing stop-frame animation; even if you saw the "snapshots" of the eye, it still would look somewhat blurred and layered. This is the brain's postprocessing outsmarting our technology.

      Easy fix? You can use a lower framerate if you add the blur. This is why we can sit through movies at 24 FPS, or TV at 30, since the recording equipment is averaging it for us. Of course, we can still sense the "synchronous" nature of the screen updating, especially when comparing camcorder stock at 60fps and movie stock at 24.

      But you don't need to update the screen nearly as fast. 60fps with motion blur is about as realistic as one could ever hope for. Have you seen HDTV football broadcasts! My god!

      So if Doom supports capping the output framerate at 60fps, but internally allows motion blurring by rendering at twice that followed by averaging, I don't think you'll notice it at all. As far as your eyes are concerned, you're screaming at 120.

      The one thing I think is a bad thing is that they chose 60. If this is vsynced on an analog monitor (necessitating a 60Hz vert refresh), it kind of sucks. I can "see" the refresh on lots of monitors at 60, but it goes away at 70. I'm sure many of you know what I mean.

      I'd rather they cap it at 70, or 72. With 72 v. ref. you could update the screen at 24fps, tripling each frame, but rendering 3-6 frames when possible to create the blur.... LIKE WATCHING A REALTIME MOVIE. When it gets busy, just cut the number of frames you average.

      That would be fricking cool.
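      As a minimal sketch of the kind of sub-frame averaging described above (purely illustrative: nothing here is Doom 3 code, and the single-channel "pixel format" is made up), the averaging step in isolation looks something like this:

        #include <cstdint>
        #include <cstdio>
        #include <vector>

        // Hypothetical accumulation-style blur: average the sub-frames rendered
        // within one display interval into the single frame actually shown.
        std::vector<uint8_t> averageSubFrames(
                const std::vector<std::vector<uint8_t>> &subFrames) {
            const size_t pixels = subFrames.front().size();
            std::vector<uint8_t> out(pixels);
            for (size_t i = 0; i < pixels; ++i) {
                unsigned sum = 0;
                for (const auto &frame : subFrames) sum += frame[i];
                out[i] = static_cast<uint8_t>(sum / subFrames.size());
            }
            return out;
        }

        int main() {
            // Two 4-pixel "sub-frames" of a bright dot moving one pixel to the right.
            std::vector<std::vector<uint8_t>> subFrames = {{255, 0, 0, 0}, {0, 255, 0, 0}};
            for (uint8_t v : averageSubFrames(subFrames))
                std::printf("%d ", int(v));      // prints "127 127 0 0" - the blur trail
            std::printf("\n");
        }
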
  • if my hardware could get anywhere near 60fps...

    Yet another way to feel inadequate... insufficient horsepower to make the framerate cap...

    anyway... what the heck... this 850MHz Duron box runs SuSE 8.2 perfectly well... s'pose I'll have to save up for a big box just to play the new games on...

  • by TheFlyingGoat ( 161967 ) on Thursday October 23, 2003 @10:48AM (#7291230) Homepage Journal
    Gotta love the sense of humor. :)

    "The first thing we realised was like: 'Damn, people are hard to kill in Doom 3 multiplayer. Why is that?' And we looked at the damage values, the hitpoints, the armour, but eventually we realised - we're just missing. We're lousy shots."
    • Well, that's what happens when you have per-polygon hit detection--it's not just a hitbox anymore. All of a sudden, you have the ability to shoot under people's arms, next to their heads, and between their legs.

      I didn't get to play Doom III (didn't go to QuakeCon or anything), but that kind of precision sounds really cool.

      • Not to mention that per-polygon hit detection also reduces splash damage. Considering the number of people I've seen in almost any Quake-based game that rely heavily on the rocket launcher and the shotgun, I can only say this is a good thing, but that there will be complaints.
        • Doesn't really matter how you design a first person shooter anymore, there will always be a sizable percentage of people who will bitch about the various choices the developers make. Personally, I think anything devs do to reduce reliance on one single super weapon is a good thing, but there IS such a thing as overbalance.
  • by j-turkey ( 187775 ) on Thursday October 23, 2003 @10:49AM (#7291232) Homepage

    No problems here. I don't think that I can differentiate between 60 FPS and 120 FPS...although I'm no expert.

    It seems like this will fix the issue addressed in the post, and it may mess with some benchmarks that will use Doom III (anandtech, tomshardware, etc)...however, I'll bet that you will be able to turn the rate-limiting off for benchmarking purposes. Otherwise, there will most likely be no noticeable effect.

    --turkey
  • Benchmarks, etc. (Score:5, Insightful)

    by BrookHarty ( 9119 ) on Thursday October 23, 2003 @10:50AM (#7291243) Journal
    Just wondering if this makes Nvidia happy, since both high-end cards can push 60FPS. Can't really use Doom3 as a benchmark, unless you can override the 60FPS cap.

    And after reading the Q3 tic info about jumps, wouldn't that be bad code design that causes the jump distance to alter with higher FPS? Where's a good FPS engine coder to comment on this...
    • Right Here [tomshardware.com]

      Now, some say THG is biased towards nvidia. Others will probably say that this is just a 'special preview' and therefore blah blah blah

      Either way, unless Doom III ends up faster than this slimmed-down release, most of the current cards can't achieve 60FPS anyway
    • Yes, it does seem to be bad code design. Judging from the post, the problem seems to be caused by some combination of premature rounding to integers (performance optimisation?) and also only being able to measure the time in milliseconds, which seems to cause a distinction between 60 FPS (1000/60 rounded is 17), and say 61 FPS (1000/61 rounded is 16).

      I thought Windows now measured time in better intervals than that using its high-performance timers (using QueryPerformanceCounter), so I'd say this looks more
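      To see the quantization being described, here is a quick sketch of my own (just the arithmetic, nothing engine-specific) of what a millisecond-resolution timer turns nearby framerates into:

        #include <cmath>
        #include <cstdio>
        #include <initializer_list>

        int main() {
            // Frame time rounded to whole milliseconds, as a 1ms-resolution timer sees it.
            for (int fps : {59, 60, 61, 62}) {
                int roundedMs = static_cast<int>(std::lround(1000.0 / fps));
                std::printf("%d fps -> %d ms per frame -> effectively %.1f fps\n",
                            fps, roundedMs, 1000.0 / roundedMs);
            }
        }
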

    • " Cant really use Doom3 as a benchmark, unless you can override the 60FPS cap."

      Sure you can. Use motion blur. The more in-between frames you have to create the blur, the better the blur gets. If you have a 5-frame motion blur, then you've got roughly 300fps of information there, not including the time it takes to average the frames together. This is assuming, of course, that Doom 3 can or will have any such feature.

      Frankly, I prefer this as a benchmark anyway.
    • You can just add more stuff to the Doom 3 level until none of the cards can hit 60, and see which one gets closest.
  • by m00by ( 605070 )
    it's ab00t fscking time they got some sense into them. nobody needs 300 fps on quake 3, not that any computer in existence could get close to that in doom 3 =D
  • Deathmatch (Score:5, Funny)

    by aridhol ( 112307 ) <ka_lac@hotmail.com> on Thursday October 23, 2003 @10:57AM (#7291305) Homepage Journal
    I wonder if it'll be anything like the original Doom's deathmatch - one guy wearing bright, glowing green, with three in dark colours. That was fun, as long as you weren't player 1.
  • I like the idea... (Score:4, Interesting)

    by Daniel Wood ( 531906 ) * on Thursday October 23, 2003 @10:58AM (#7291314) Homepage Journal
    Even as an individual who benefitted from the 90+fps bug, I like this idea. It really levels the playing field. Now, I just hope Doom3 plays nicely with my Dual 3200+/9700Pro or Dual 2GHz G5/9600 Pro at something close to 1280x1024(LCD).

    John Carmack just scored more points in my book. Now if only we could force everyone to use full shadows...

  • by yancey ( 136972 ) on Thursday October 23, 2003 @11:00AM (#7291340)
    Does this mean that the display will also be set to a 60 Hz vertical refresh, which causes "flicker" with 60-cycle-per-second light sources, like some fluorescent lighting? I think I would very much prefer they cap it at 72 FPS, or at least set the vertical refresh higher than 60.
    • by Lukey Boy ( 16717 ) on Thursday October 23, 2003 @11:27AM (#7291604) Homepage
      Your monitor's refresh rate is different from the game's internal clock, so you can definitely increase the refresh rate on your monitor past 60. There'll just be a bunch of duplicate draws.
    • He didn't say anything about controlling your refresh rate with the game code, so it's unlikely that'll be an issue, as long as you have your refresh rate set properly.

      Look at it this way: with v-synch on, if your card can produce a solid 60 fps average or higher, at least it's less likely that you'll actually drop any of the 60 frames at 72hz.
      • What if you use a monitor that gives out 85hz at 1600x1200 (NEC FP955)... wouldn't you want the game to calculate its physics every 1/85th of a second, and render it? If you run at 1/60th, some frames will be doubled to fit in between... doesn't seem too good! And no, I won't use 60hz... but anyway, my CPU won't hold it... FOR NOW :)
        • Perhaps, if the system could handle it consistently. On the other hand, if the system can handle 85 fps only 50% of the time and drops 10-25% of the time to 40fps or less, would you prefer to cap at 60fps and only suffer a 33% hit in the performance 10-25% of the time or keep it at 85 fps and suffer a 50% hit in the performance 10-25% of the time?

          It didn't take me long playing online to decide that I was better off with a lower framerate most of the time to minimize the difference between the highest and l
    • Just to be pedantic, fluorescent lights flicker at 120 Hz because there are two zero-crossings of the line voltage in each cycle.
  • by Hellvetica ( 170451 ) on Thursday October 23, 2003 @11:07AM (#7291413) Homepage
    From here: http://www.shacknews.com/ja.zz?id=8743907 [shacknews.com]
    Alright, I'm going to try and break this down... there are actually 3 entirely separate things people are talking about here: simulation frame rate, rendering frame rate, and monitor refresh rate.


    'Hurtz' or 'hz' is a universal term that just means "X whatevers per second", so having 60FPS means your card is rendering at 60hz.

    Now, in the post Carmack says nothing about monitor refresh rate, so that really isn't anything to worry about. Your monitor will still refresh at whatever you want it to refresh at, within its capabilities. The other two things, the simulation rate and the rendering rate, are both going to be locked at 60hz/FPS.

    Let me try an analogy. Let's say you are in a room, and next door there is a chess match. The frequency at which the chess pieces are moved is the simulation or game rate. Now if you have someone taking a polaroid snapshot at a certain rate, that is the rendering rate, what everyone knows as FPS. If someone else is taking those photographs and bringing them to you, that is like your monitor's refresh rate.

    This isn't a perfect analogy, but it's good enough to illustrate the point: if the chess pieces only move once per minute, no matter how often someone takes a picture of it, it will always look the same.

    60hz is a big leap for the game's simulation rate though; if I recall correctly Quake 3 ran at around 20 to 25 by default, but you would see in-between stuff due to a trick called interpolation. The statement in the article seems to imply that Doom 3 won't be doing any interpolation, which I think is the most interesting aspect of the comment.
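    For anyone curious what a fixed 60Hz tic decoupled from the rendering rate looks like in code, here is a minimal accumulator-style sketch; it illustrates the general technique only, and is in no way id's actual implementation:

      #include <chrono>
      #include <cstdio>

      // The simulation always advances in fixed 1/60 s tics, while "rendering"
      // happens as often as the loop spins. Rendering faster than the tic rate
      // just redraws an identical world state.
      int main() {
          using clock = std::chrono::steady_clock;
          const double ticSeconds = 1.0 / 60.0;   // fixed simulation rate
          double accumulator = 0.0;
          long simulatedTics = 0, renderedFrames = 0;
          auto start = clock::now();
          auto previous = start;

          // Spin for one second of wall-clock time just to show the ratio.
          while (std::chrono::duration<double>(clock::now() - start).count() < 1.0) {
              auto now = clock::now();
              accumulator += std::chrono::duration<double>(now - previous).count();
              previous = now;

              // Advance the game state in whole, fixed-size tics only.
              while (accumulator >= ticSeconds) {
                  ++simulatedTics;          // stands in for simulateTic(ticSeconds)
                  accumulator -= ticSeconds;
              }
              ++renderedFrames;             // stands in for renderFrame()
          }
          std::printf("%ld tics simulated, %ld frames rendered\n",
                      simulatedTics, renderedFrames);
      }
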
    • 60hz is a big leap for the game's simulation rate though; if I recall correctly Quake 3 ran at around 20 to 25 by default, but you would see in-between stuff due to a trick called interpolation. The statement in the article seems to imply that Doom 3 won't be doing any interpolation, which I think is the most interesting aspect of the comment.

      There is one more rate to take into consideration and that is the sample rate for models. When objects in the game are animated, one samples the position of "bones"

  • by molo ( 94384 ) on Thursday October 23, 2003 @11:22AM (#7291538) Journal
    Quake 1 NetQuake (not QuakeWorld) was capped at 72 fps. Since QW, Q2 and up do client-side movement/prediction, this has been an issue on the client.

    The thing that players need to worry about is MINIMUM FPS. During a firefight, there will be more elements to draw, and hence a longer render time. Having this drop below your framerate cap is a bad thing.

    The issue with jumps being variable depending on ticrate has been a problem since the Quake 1 days too. In NetQuake, it was only the server ticrate which mattered. Today with client-side prediction, it's the client's too.

    This just makes me say to id: fix your damn physics model! Why should the jump distance be dependent on ticrate?! Some weird quantization errors you have there.

    -molo
    • This just makes me say to id: fix your damn physics model! Why should the jump distance be dependent on ticrate?! Some weird quantization errors you have there.

      They are fixing it now -- by locking the frame rate. What I think is the source of the problem is:

      1) floating point mathematics has finite precision
      2) it is hard to retrieve accurate timings when the framerate is high
      3) fast floating point operations are less accurate

      You could possibly trade accuracy for speed, but then the frame rate would drop, specia

  • Wow... (Score:2, Insightful)

    Now's where it goes from being an interesting demonstration of all the technologies to being a fabulous game, and that really does all happen at the end.

    Is it just me or does that sound horribly wrong? Actually, we also enjoy good plots, character design, thoughtful level layout, adequate difficulty, intuitive user menus, goodies to unlock...

    High fps's and good looking exploding heads don't give us a good argument when the Moms with homicidal moron children come a' suin'.

    • Now's where it goes from being an interesting demonstration of all the technologies to being a fabulous game, and that really does all happen at the end.

      Is it just me or does that sound horribly wrong? Actually, we also enjoy good plots, character design, thoughtful level layout, adequate difficulty, intuitive user menus, goodies to unlock...


      I think the reference has nothing to do with the technical points, but rather with the point in time of the development of the game. Id typically develops their engine
  • Now that he is "standardizing" the framerate, what features do you think will be turned on by default and stress the graphics engine the most? You know he will think of something.
  • Bad coding style? (Score:3, Insightful)

    by kwench ( 539630 ) <kwench79@yahoo.de> on Thursday October 23, 2003 @12:09PM (#7292161) Homepage
    I wonder which dumba^H^H^H^H^Hmathematical specialist chose to use the framerate as the time unit for all the calculations. That's highly unfair. After all, I want things to happen - even if I don't see them on my 200MHz MMX ATI Radeon. ;-)

    And actually, this time unit problem is quite old: Remember all those funny M$-DOS games that you liked to play on your 8086? And then you had to switch this "TURBO" button on your 80286 to be able to play them again.

    OK, guys, to troll a little bit:
    1) Amiga programmers used to use the timer.device. ;)
    2) One poster told you Hz (or 1/s) stands for Hurtz. Never heard about Hurtz, but Heinrich Hertz is supposedly upset that someone stole his idea...
    • Actually, only later Amiga programmers used timer.device; the earlier ones just grabbed the system clock from the hardware directly - less overhead.

      Mind you, on Pentiums and later you can use (IIRC) the RDTSC instruction, which gives you the number of clock cycles since the last reset as a 64-bit number.
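      For reference, a minimal sketch of reading that counter; the __rdtsc() intrinsic exists on x86 compilers (the header differs between MSVC and GCC/Clang), and converting cycles to seconds requires knowing the CPU frequency, which this sketch simply takes as a parameter:

        #include <cstdint>
        #if defined(_MSC_VER)
        #include <intrin.h>
        #else
        #include <x86intrin.h>
        #endif

        // Read the 64-bit timestamp counter (cycles since reset, as noted above).
        inline uint64_t readCycleCounter() {
            return __rdtsc();
        }

        // Illustrative conversion: cycles to seconds, given an assumed CPU frequency.
        inline double cyclesToSeconds(uint64_t cycles, double cpuHz) {
            return static_cast<double>(cycles) / cpuHz;
        }
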
    • Comment removed based on user account deletion
  • Bad solution (Score:5, Insightful)

    by MobyDisk ( 75490 ) on Thursday October 23, 2003 @12:15PM (#7292238) Homepage
    Far be it from a lowly coder to question Mr. Carmack, but this seems like a hack. The analysis Slashdot linked to indicates that the problem is caused by a series of errors:
    1) Rounding to integer units in the computations
    2) Rounding to milliseconds in frame-rate-control
    3) Using frame-dependent computations

    These aren't new problems; they are three no-nos in game design. Locking the frame rate has many disadvantages (pointed out in other posts). By doing this, it implies that Doom III is still using flawed calculations. How long before somebody decides to create a mod that unlocks the frame rate without fixing the problem? Maybe the timeline is the issue, so I hope it is something addressed in a future patch.

    Let me clarify:

    1) Most calculations moved from fixed-point integer math to floating-point math in Q1. Integer positions may be okay, if the number is large enough (64-bit?).
    2) Timings are accurate enough now that this should not be an issue. Many engines keep values such as the # of ms for the last frame, and the rolling average in floating-point units.
    3) This is one of the most common mistakes in game coding and demo coding. Take falling as an example:
    y1 = y0 - .5*a*t^2. Instead of computing a delta-y at each frame, you should recompute the entire equation each frame, and offset it again from the original point. This means that anything from 1fps to infinite FPS will be accurate, limited only by the accuracy of the FPU (see the sketch after this comment).

    Some people will argue that these approaches are slower. But for most games, only about 20% of the time is spent doing calculations for positioning, AI, etc. Most of it is rendering. So a few extra floating-point multiplications per frame is not an issue, even on a slow PC.
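    As a concrete illustration of point 3 (my own sketch with invented names, not engine code), compare accumulating a per-frame delta against re-evaluating the closed-form fall equation from the elapsed time:

      #include <cstdio>
      #include <initializer_list>

      const double GRAVITY = 9.8;   // m/s^2

      // Frame-dependent: accumulates a delta each frame, so the result drifts with dt.
      double fallDelta(double y, double &velocity, double dt) {
          velocity += GRAVITY * dt;
          return y - velocity * dt;
      }

      // Frame-independent: recomputed from the original height and elapsed time,
      // so any frame rate gives the same trajectory (up to FPU precision).
      double fallClosedForm(double y0, double elapsed) {
          return y0 - 0.5 * GRAVITY * elapsed * elapsed;
      }

      int main() {
          // Simulate roughly one second of falling at 60 fps and at 200 fps.
          for (double dt : {1.0 / 60.0, 1.0 / 200.0}) {
              double y = 100.0, v = 0.0, t = 0.0;
              while (t < 1.0) { y = fallDelta(y, v, dt); t += dt; }
              std::printf("dt=%.5f  delta method y=%.4f  closed form y=%.4f\n",
                          dt, y, fallClosedForm(100.0, t));
          }
      }
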
    • Far be it from a lowly coder to question Mr. Carmack, but this seems like a hack.

      Don't worry, it doesn't just seem like a hack, it actually is a hack.

      Unfortunately, there aren't many people out there in the same league as Carmack, so it may be a while before we see him finally decide to fix this properly, rather than just capping the framerate. Personally, I prefer the capped framerate to nothing at all, but I'd much rather see it fixed properly.

      All of that being said, I haven't gone into say the Quake
    • If someone makes a "mod" that breaks the game, why should I care? I have a simple solution: don't use the mod, silly.
    • *Some people will argue that these approaches are slower. But for most games, only about 20% of the time is spent doing calculations for positioning, AI, etc. Most of it is rendering. So a few extra floating-point multiplications per frame is not an issue, even on a slow PC.*

      Unfortunately, when doing videogames you have to draw the line somewhere.

      Oh yeah, there's an awful lot of hacks used in coding games (they're used to make the games look cool and playable); they've been used for as long as games have been
    • It is easy to recompute from the initial position for each iteration if you know that no new information is going to influence the equation after the player has jumped. However, this assumption doesn't hold, as was noted in the explanation. A rocket may come zooming in and push the player off his path. Or the player may exert "air control".

      The only way these "free will" elements can be accounted for with a recomputation from the initial position is if every acceleration vector is stored for the entirety of
    • y1 = y0 - .5*a*t^2. Instead of computing a delta-y at each frame, you should recompute the entire equation each frame, and offset it again from the original point.

      That works for your simple example, but many problems in physics simulation have no closed-form solution, and numerical integration is the only way to go about solving the problem. Numerical integration always introduces some error, depending on the method of integration used, and the larger the timestep, the larger the error (at some poin
    • I think you're misunderstanding WHY the frame rate was capped.

      My interpretation of this matter is that the game physics engine uses a FIXED and CONSTANT 60Hz time scale. So no matter how fast or slow the graphics are drawn, the piecewise linear simulations are all done with constant delta-t's. Everyone's jump follows the same trajectory, every rocket lands in the same location, etc. regardless of how many snapshots of this motion are displayed (i.e. frame rate.)

      The implication of this is two-fold:

      1. Th
  • In other words... (Score:5, Insightful)

    by stienman ( 51024 ) <adavis&ubasics,com> on Thursday October 23, 2003 @12:20PM (#7292309) Homepage Journal
    Translation:

    Our engine is wicked fast because we calculate everything with integer math. Our physics engine runs at a rate fixed according to time, not machine cycles, so that any computer fast enough for the game will run the physics exactly the same as any other machine, and so integer math round-off errors will be consistent, and can be made up for in the map design.

    We chose to fix the frame rate to the same rate as the physics engine so that video cards will not be re-rendering the same frame twice. If they did, then the game would appear jumpy.

    If you run at 72fps, and the engine runs at 60, then you'd get a duplicate frame every 5 real frames. Since the controls are tied to the physics engine, the controls would feel laggy 12 times a second, until the frame rate again caught up with the engine.

    The optimal setup will have the monitor set to 60Hz, or perhaps a higher frequency that is a multiple of 60Hz, or at least one that results in quick beats: rather than 1 in 5, choose 90Hz so every third frame is a duplicate, or best choose 120Hz.

    Or just turn off the fluorescent lights. You want a terrifying experience anyway - who wants to play with the lights on?

    -Adam
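    The duplicate-frame arithmetic above is easy to check; this throwaway sketch (illustrative only, not tied to any engine) counts how many refreshes per second must repeat a simulation frame for a few common refresh rates against a 60Hz tic rate:

      #include <cstdio>
      #include <initializer_list>

      // A display at refreshHz showing a simulation that only produces ticHz
      // distinct frames per second must repeat a frame for the surplus refreshes.
      int duplicateFramesPerSecond(int refreshHz, int ticHz) {
          return refreshHz > ticHz ? refreshHz - ticHz : 0;
      }

      int main() {
          for (int hz : {60, 72, 85, 90, 120}) {
              std::printf("%3d Hz refresh vs 60 Hz tics: %2d duplicated frames/s\n",
                          hz, duplicateFramesPerSecond(hz, 60));
          }
      }
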
    • If you run at 72fps, and the engine runs at 60, then you'd get a duplicate frame every 5 real frames. Since the controls are tied to the physics engine, the controls would feel laggy 12 times a second, until the frame rate again caught up with the engine.

      The optimal setup will have the monitor set to 60Hz, or perhaps a higher frequency that is a multiple of 60Hz, or at least one that results in quick beats: rather than 1 in 5, choose 90Hz so every third frame is a duplicate, or best choose 120Hz.

      If Doom 3 is

      • A 60hz cap doubles your mouse input lag from 8ms to 17ms! While your eyes have trouble distinguishing such small time differences, you can "feel" the difference when mousing with 8ms lag versus 17ms lag.

        No need to use caps, young man. You're beginning to sound like an audiophile who thinks tubes are better, and records are better than tapes and CDs.

        There is no doubt that on the old games you can 'feel' the difference in lag between 60 and 125hz update rates.

        The new game, however, does not have th
        • I played Doom 3 deathmatch at this year's Quakecon. The mouse look was far less smooth than Quake 1 and 3's mouse look. I figured it had something to do with low framerates... something which would eventually be fixed with ultra fast 3D accelerators. The fact that Id Software is capping the FPS to 60hz means that there will be no fix... mouse looking in Doom 3 will be inferior to Quake 1 and Quake 3.

          In fact, Quake 1 has the best mouse look to date because once it became open source, developers implemente
    • "Or just turn off the flourescent lights. You want a terrifying experience anyway - who wants to play with the lights on?"

      I'm scared of the dark, you insensitive clod!

  • Oh no, not a cap at 60! I would be lucky to get beyond 30 FPS.
  • What the hell!? (Score:1, Redundant)

    by nomel ( 244635 )
    Why doesn't he just fix the code? I can tell a difference between 60 fps and higher. It's easy to test: just set your refresh rate to 60 Hz and see if you can see nice flickering.
    • Why doesn't he just fix the code?

      Because it's a lot harder to separate the framerate from the back-end after the engine's coded than if you do it that way from the start. Why he did it this way in Doom 3 I couldn't tell you, except that it's the way he did it in the rest of his engines, too.

      I can tell a difference between 60 fps and higher. It's easy to test, just set your refresh rate to 60 Hz and see if you can see nice flickering.

      Umm, you're not testing whether or not you can see the difference be
      • With the monitor refresh test, I was just showing that the eyes can detect visual changes at 60 changes per second.

        But, I agree.

        That's one of the reasons I love LCD's so much. Sure the pixel refresh is slow and blurs, but it sure is nice on the eyes for reading and whatnot. :)
  • Yeah, I can tell the difference between 60fps and 75fps. 60fps gives me a headache after ten minutes.
  • Finally, we can get rid of the biggest limitation today in gaming: Rich gamers.

    By capping the frame rate at 60fps, gamers with insanely fast computers will no longer be put at an unfair advantage. This will also (finally) end the gaming age of machismo - yes, people STILL buy faster PCs to get more FPS on Quake3.

    Honestly, I find that any game above 30fps is perfectly playable. Any more than 60 seems silly. Aside from that, there are few monitors that can draw so many frames per second... I actually fin
  • Can't they make this optional, like, for example, in GTA? GTA was a lousy PS2 port and was programmed for a fixed framerate (30FPS). When run with a variable frame rate the speed of certain in-game events changed (!), but the programmers still left the option to play with uncapped framerate. Why should Doom 3 be different, especially if variable framerate is not a problem for the engine?

    BTW, Doom3 will become irrelevant much sooner with this framerate cap. Do you think any game site would still use Quake3
  • Many quake derivatives have solved it. (BF1942, for instance?)
  • Just in case anyone fell behind on the WHY of this like I did :)

    Here it is [vuurwerk.net]

  • USE COMMON SENSE!

    When an object moves in front of our eyes or on the screen, yeah we might not need 100 frames per second. Move your outstretched hand left to right really fast in front of your eyes, and you will notice that you see a blur of a mix of colors. This is what you guys keep talking about. Well, you guys are WRONG for focusing on the WRONG thing.

    What you guys are talking about is MOVEMENT, but you should be talking about EYE SCANNING. Yes, our eyes aren't fast enough to see individual fram

"Nuclear war can ruin your whole compile." -- Karl Lehenbauer
