
NVIDIA On Their Role in PC Games Development

GamingHobo writes "Bit-Tech has posted an interview with NVIDIA's Roy Taylor, Senior Vice President of Content/Developer Relations, which discusses his team's role in the development of next-gen PC games. He also talks about DirectX 10 performance, Vista drivers and some of the upcoming games he is anticipating the most. From the article: 'Developers wishing to use DX10 have a number of choices to make ... But the biggest is whether to layer over a DX9 title some additional DX10 effects or to decide to design for DX10 from the ground up. Both take work but one is faster to get to market than the other. It's less a question of whether DX10 is working optimally on GeForce 8-series GPUs and more a case of how is DX10 being used. To use it well — and efficiently — requires development time.'"
  • FTFA:

    [...]WQUXGA, 3840x2400, or nine million pixels. [...] We asked Roy what size monitors we'd see with this kind of resolution, but he didn't really give any specifics: "I think you can already buy 22" monitors with this resolution, but they're not designed for gaming because the refresh rates are too high. They also cost too much, too." I guess from that, we might see 30" monitors at 3840x2400, or we may see even bigger monitors...

    Conjecture aside, what refresh rates are they using now?

    I would have as

    • Re:Just one question (Score:4, Informative)

      by merreborn ( 853723 ) on Tuesday June 26, 2007 @02:57PM (#19653433) Journal

      but they're not designed for gaming because the refresh rates are too high


      http://en.wikipedia.org/wiki/QXGA#WQUXGA [wikipedia.org]

      Apparently, the existing monitors at WQUXGA (worst. acronym. ever.) resolution run at 41 Hz, max. These days, top-of-the-line game systems will pump out upwards of 100 frames/sec in some cases. A 41 Hz refresh rate essentially caps you at 41 FPS, which is enough to turn off any gamer looking at blowing that much on a gaming rig.
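
      To put the 41 Hz figure in perspective, the bottleneck is raw pixel throughput: 3840x2400 is so many pixels that even modest refresh rates exceed what a single-link DVI connection (roughly a 165 MHz pixel clock) can carry, which is presumably why the early WQUXGA panels were driven at low refresh rates over multiple links. A minimal sketch of the arithmetic, using the standard single-link DVI limit and ignoring blanking intervals:

        #include <cstdio>

        int main() {
            // WQUXGA resolution from the comment above.
            const double pixels = 3840.0 * 2400.0;        // ~9.2 million pixels per frame

            // Single-link DVI tops out at a 165 MHz pixel clock (~165 Mpixels/s).
            const double single_link_dvi = 165.0e6;

            const double rates[] = {41.0, 60.0, 100.0};   // refresh rates to compare
            for (double hz : rates) {
                const double throughput = pixels * hz;    // pixels per second
                std::printf("%5.0f Hz -> %7.1f Mpixels/s (%.1fx a single DVI link)\n",
                            hz, throughput / 1e6, throughput / single_link_dvi);
            }
            return 0;
        }

      Even 41 Hz works out to more than two single links' worth of pixels, so the low refresh rate reads as an interface-bandwidth ceiling rather than a design choice.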
      • Re: (Score:2, Informative)

        41 FPS for display purposes. However, physics/AI/etc. are often done "per-frame." A higher FPS will still affect those (more so since any decently threaded game will use fewer resources rendering.) Most display systems cannot handle 100Hz, and most humans cannot tell the difference above 25-30 Hz. It's only games where slow displays lead to slow calculated frames that this will cause a problem. That and arrogant SOB's who claim they can tell the difference without FRAPS.

        Plus, at that resolution you ar
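
        The "per-frame" coupling described above is easiest to see in a sketch of a game loop where the simulation step is derived from however long the last frame took, so a slower display means fewer, larger physics/AI steps. The function names below are purely illustrative stand-ins for engine hooks:

          #include <chrono>
          #include <cstdio>

          // Illustrative stand-ins for whatever a real engine provides.
          static void update_physics_and_ai(double dt) { std::printf("sim step dt=%.4f s\n", dt); }
          static void render_frame() { /* draw, then block on vsync if the display caps the rate */ }

          int main() {
              using Clock = std::chrono::steady_clock;
              auto previous = Clock::now();
              for (int frame = 0; frame < 10; ++frame) {     // a real loop runs until quit
                  const auto now = Clock::now();
                  const double dt = std::chrono::duration<double>(now - previous).count();
                  previous = now;

                  // Simulation advances once per rendered frame: at 41 Hz it gets
                  // 41 larger steps per second, at 100 Hz it gets 100 smaller ones.
                  update_physics_and_ai(dt);
                  render_frame();
              }
              return 0;
          }

        Engines that decouple simulation from rendering with a fixed timestep avoid this, which is consistent with the comment's point that only some games are affected.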

        • Re: (Score:2, Informative)

          Most display systems cannot handle 100Hz, and most humans cannot tell the difference above 25-30 Hz. It's only games where slow displays lead to slow calculated frames that this will cause a problem. That and arrogant SOB's who claim they can tell the difference without FRAPS.

          *sigh* Where do people come up with this garbage? Look at some evidence already instead of making stuff up:

          http://mckack.diinoweb.com/files/kimpix-video/ [diinoweb.com]

          • *sigh* Where do people come up with this garbage?
            Digg. No really. You'd be modded down for going against groupthink and providing actual links.
          • I've checked various framerates. I cannot tell the difference above around 18-20 Hz when paying attention (I've played games at 12-15 FPS without noticing anything wrong), but I recognize others can. In movies/animation/etc., 24, 25 (PAL), and 29.97 (NTSC) fps are standard.

            • by Endo13 ( 1000782 )
              I don't know, maybe it depends on the type of monitor. But I can easily tell a significant difference between 60 Hz, 70 Hz, 75 Hz, 85 Hz, and 100 Hz on any CRT monitor. The lowest refresh rate I can comfortably use on a CRT for extended periods of time is 85 Hz.
            • It depends on the game, and the amount of movement.

              Star Wars Galaxies? Yeah, that's fine at 15-30... But try playing Burnout at 15 fps and you will notice a difference...

              Above 30, I'll admit, I don't really notice too much. It's more of a smoothness thing, though. If 30 is as low as it goes, I won't complain too much, but if that's as high as it goes, or even the average - it probably dips much lower... and that IS noticeable.
      • Re: (Score:3, Funny)

        by Fiver- ( 169605 )
        WQUXGA (worst. acronym. ever.)

        Yeah, but think of the points you could rack up in Scrabble.

      • Apparently, the existing monitors at WQUXGA (worst. acronym. ever.) resolution run at 41 Hz, max. These days, top-of-the-line game systems will pump out upwards of 100 frames/sec in some cases. A 41 Hz refresh rate essentially caps you at 41 FPS, which is enough to turn off any gamer looking at blowing that much on a gaming rig.

        What bothers me more is that the screen uses a 16:10 aspect ratio. Apple seems quite fond of 16:10 for some reason (according to that link). I hate 16:10.

        In principle, 16:10
  • Heh. (Score:3, Funny)

    by michaelhood ( 667393 ) on Tuesday June 26, 2007 @02:50PM (#19653351)
    As an early 8800GTX adopter, I'd like to tell NVIDIA where they can shove this $700 paperweight..
    • Re: (Score:1, Offtopic)

      by ardor ( 673957 )
      Then why did you buy it in the first place?
      Besides, an 8800 GTX is a very good card. I chose a GTS because of power usage and price. But if you bought a $700 high-end card without actually wanting it, then you are to blame.
      • Re:Heh. (Score:5, Interesting)

        by michaelhood ( 667393 ) on Tuesday June 26, 2007 @03:07PM (#19653583)
        You obviously didn't get the idea... My problem is that the DX10 angle was played up so heavily, with the claim that the card's potential would only truly be unlocked in a DX10 environment.

        Now NVIDIA is basically advising developers to proceed with caution in DX10 implementations.

        Nice.
        • by kaleco ( 801384 )
          The DX10 situation was obvious from the beginning. DX10 support would be Windows Vista only, and it would be a while before Vista had the marketshare to justify widespread DX10 game development. Most early adopters found this out in their research, and accepted the 8800 for the benefits it brings to DX9 games, or didn't buy the card at all.
        • by LWATCDR ( 28044 )
          Don't blame Nvidia; blame Microsoft.
          DX10 is Vista-only. You have to look at the market share: there are a lot more XP machines than Vista machines. If you write to DX9, your potential market is, I would guess, about 100 times the size of a Vista-only game's.
          Notice that Microsoft's Flight Simulator 10 was written for DX9.
          But thanks for buying a bleeding-edge card. In three years, when I pay $200 for my DX10 card, it will probably be faster than your $800 card. Without people like you the rest of us wouldn't get to buy good ca
          • Pioneers get slaughtered, settlers get rich.

            Ehhh... that's possibly the most idiotic saying I've ever read on /.

            Stupid pioneers like Thomas Edison (idiot! I pay $.53 for lightbulbs today), Neil Armstrong (duh, like the moon taught us anything!) or even the British/Spanish explorers (Retards! I was born in America, hahaha, idiots risked their lives to sail boats over here).

            But, no no, I'm sure your little saying is applicable somewhere... yeah.
            • by LWATCDR ( 28044 )
              The British were settlers, not explorers.

              Do you use an Altair PC? Fly on planes made by Curtiss-Wright Aircraft? Use VisiCalc for your spreadsheets?
              Armstrong wasn't the pioneer; that would have been Robert Goddard.

              Going first always has costs and risks, and more often than not it pays off for the people who go into a land or market second or third.
              IBM wasn't the first to produce a computer; they followed Sperry. Apple and IBM were not first with home computers or PCs; they were following Altair and IMS
        • by abdulla ( 523920 )
          Don't forget that those features are accessible under OpenGL, which means that they will be available across platforms. DX10 is not your only option.
    • by Murrdox ( 601048 )
      Uhhh depends.

      Are you running that 8800GTX on Vista or XP?

      Because I have to tell you, I'm ALSO an early adopter of the 8800GTX. I run XP (screw Vista) and I couldn't be happier. It was worth every single penny. I haven't had a single problem with it whatsoever.

      I run all my games at 1900x1220 resolution at maximum detail levels, and they are all gorgeous. I don't have any performance issues at all.

      If you have yourself a $700 paperweight, you've got something else going wrong besides the card i
      • I'm a fellow PC gamer enthusiast. Me and my 7900GTX are doing just fine with Vista. 1900x1200 C&C3 with max details...runs smooth.

        Yay Vista!

        (duck)

      • I have yet to hear a fellow PC gamer enthusiast say something positive about Vista.

        Leaving aside the cost of the OS, I have nothing against Vista (I use Home Premium) at the moment. I'm using the NVIDIA 158 BETA drivers for my 8800GTS, which are extremely stable with exceptional (compared to the early release drivers) performance. If I had to say anything positive about Vista and gaming, it's that loading times for games and game levels were almost halved after I moved from XP to Vista. Your mileage may vary th

        • Are the load times better because of an increase in memory, or have you run both XP and Vista on the same hardware? I have a brother who swears up and down that XP is ten times faster than Windows 98, from when I installed it on his family's computer. I keep telling him it is a combination of going from an older 10 gig hard drive with 128 megs of RAM to a 7200 RPM Western Digital (8 MB cache) 120 gig drive and 1 gig of RAM. He won't consider the hardware upgrade as a possibility for the increase. Because the
          • It certainly wasn't an increase in memory; I have used 2 gig for a long time now. I did install Vista itself to a new SATA drive, but the games still ran from the same IDE drive. Certainly the swap file itself would have been faster, and I won't discount the hardware as having an effect; it could possibly also be because a new install of an operating system (Windows/Linux/OSX) is usually faster due to less clutter. Unfortunately I didn't do extensive benchmarks to determine the root cause of the increase. I thin
    • Re:Heh. (Score:5, Funny)

      by kaleco ( 801384 ) <greig@marshall2.btinternet@com> on Tuesday June 26, 2007 @03:09PM (#19653607)
      Oh, it's not a paperweight, you're using it wrong. If you install it in your PC, it will improve your graphics.
    • As an early 8800 adopter (January 15th), I've been pretty happy. To be honest, though, I got the board as a Christmas gift. The Vista drivers were a little rough, but I've been around enough to know that brand new hardware + brand new OS is going to cause trouble. Suck it up and deal.

      For the first month or so, I dual-booted XP, but since the middle of March I've been running Vista only, and played Vanguard, Company of Heroes, LotRO, Civ4 and a bunch of other stuff with almost no issues. Except for Va
      • Re: (Score:3, Funny)

        by feepness ( 543479 )

        As an early 8800 adopter (January 15th), I've been pretty happy. To be honest, though, I got the board as a Christmas gift.

        You're doing it wrong.
        • by Creepy ( 93888 )
          As humorous as that is, there is such a thing as pre-order, so it is possible.

          The day I get a $700 Christmas gift (much less a pre-order) is the day my wife wins the lottery. If I get a pre-order graphics card from her, I know the aliens have truly infiltrated earth and replaced my wife with a brain eating monster. She thinks I play games too much as it is - about 10 hours a week - certainly not the 10 hours in a day I did sometimes in college (I was a binge gamer ;)
    • Re: (Score:3, Interesting)

      by illumin8 ( 148082 )

      As an early 8800GTX adopter, I'd like to tell NVIDIA where they can shove this $700 paperweight..
      I too have an 8800GTX and it's been nothing but a great card for me. All of my games play very fast on it, and it's much quieter than my previous 7800GTX. I'm not using Vista yet (sticking with XP SP2), so maybe that's why you don't like yours. I have to say it is the best graphics card I've ever had.
      • by fbjon ( 692006 )
        How much heat are these things putting out these days? I'm considering something between a 7600 and a 7900, probably with only a passive heatpipe. Are you saying the 8800 actually has less heat dissipation?
        • How much heat are these things putting out these days? I'm considering something between a 7600 and a 7900, probably with only a passive heatpipe. Are you saying the 8800 actually has less heat dissipation?

          No, I'm not saying these things are putting out any less heat than the previous model. In fact, based on the power draw requirements (it takes 2 PCI-Express power connectors instead of 1 like most cards), I would guess this thing generates a lot more heat.

          What is better about the 8800GTX compared to my prev

        • No, it uses more power (about 2x more), but the 8800 series has a wonderfully engineered heatsink that is better than anything previously offered as standard.

          Of course, you'll only find it on the high-end cards, because those are the only cards where they can actually afford a quality cooler. Stock midrange cards use the cheapest coolers manufacturers can find, and you have to pay extra for a good cooler (or passive cooling solution).
  • by anss123 ( 985305 ) on Tuesday June 26, 2007 @03:01PM (#19653513)
    "As the only manufacturer with DirectX 10 hardware, we had more work to do than any other hardware manufacturer because there were two drivers to develop (one for DX9 and one for DX10). In addition to that, we couldn't just stop developing XP drivers too, meaning that there were three development cycles in flight at the same time."

    Didn't ATI kick out some DX10 hardware the other day? I'm sure the ATI x29xxx is DX10.

    "Our research shows that PC gamers buy five or more games per year, and they're always looking for good games with great content.

    Interesting, but it makes me wonder how they define "PC gamer".

    "Tony and David are right, there are API reductions, massive AA is 'almost free' with DX10. This is why we are able to offer CSAA [up to 16xAA] with new DX10 titles - the same thing with DX9 just isn't practical. Also interesting, but I'm skeptical. Turning on AA is just one API call, how does AA affect overhead?

    "So yes we will see big performance jumps in DX10 and Vista as we improve drivers but to keep looking at that area is to really miss the point about DX10. It's not about - and it was never about - running older games at faster frame rates. Wait, rewind. Are he saying my DX7/8/9 games will run faster once Nivida gets their DX10 drivers together? Or is he saying games with DX9 level of graphics will run faster if ported to DX10?

    "Five years from now, we want to be able to walk into a forest, set it on fire and for it to then rain (using a decent depth of field effect) and to then show the steam coming off the ashes when the fire is being put out."

    No, I can do that in real life. A pyromaniacs-vs.-firefighters burn fest, OTOH...
    • Interesting, but it makes me wonder how they define "PC gamer".
      Someone who plays games on PCs, and buys at least five new titles a year, obviously...
    • Also interesting, but I'm skeptical. Turning on AA is just one API call, how does AA affect overhead?

      I'm wondering if this has more to do with an architectural change than just a software modification. Maybe DirectX 10 specifications just require the board to have a daughter die similar to what the graphics processor in the 360 has.
      • I'm wondering if this has more to do with an architectural change than just a software modification. Maybe DirectX 10 specifications just require the board to have a daughter die similar to what the graphics processor in the 360 has.

        Well, according to nVidia [nvidia.com]:

        The method of implementing CSAA in DX9 differs from DX10. This is due to a limitation in the DX9 runtime, which prohibits the driver from exposing multisample quality values greater than 7. For this reason, instead of specifying the number of coverage samples with the quality value, we simply set quality to a predetermined value which will be interpreted as a specific CSAA mode by the driver.

        So there. It looks like it's just as possible under DX9 but you can't give your devs the warm fuzzy glow of going "set supersampling to 11!"
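
        From the application side, the difference the quote describes is that DX9 only has the MultiSampleType/MultiSampleQuality pair on the present parameters, with quality capped at 7 by the runtime, so a CSAA-aware driver has to repurpose particular quality values. A minimal Direct3D 9 sketch of that knob, assuming the usual device-creation flow (the quality value picked here is simply whatever the driver reports, not NVIDIA's documented CSAA mapping):

          #include <d3d9.h>   // Direct3D 9 headers; link against d3d9.lib

          // Sketch: requesting multisampling through the DX9 interface. The runtime caps
          // MultiSampleQuality at 7, so (per the quote above) a driver can interpret
          // particular quality values as CSAA modes instead of exposing them directly.
          HRESULT create_device_with_aa(IDirect3D9* d3d, HWND window, IDirect3DDevice9** out_device)
          {
              DWORD quality_levels = 0;
              // Ask how many quality levels the driver exposes for 4x multisampling.
              HRESULT hr = d3d->CheckDeviceMultiSampleType(
                  D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
                  TRUE /*windowed*/, D3DMULTISAMPLE_4_SAMPLES, &quality_levels);
              if (FAILED(hr) || quality_levels == 0)
                  return FAILED(hr) ? hr : D3DERR_NOTAVAILABLE;

              D3DPRESENT_PARAMETERS pp = {};
              pp.Windowed           = TRUE;
              pp.SwapEffect         = D3DSWAPEFFECT_DISCARD;   // required for multisampling
              pp.BackBufferFormat   = D3DFMT_X8R8G8B8;
              pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
              pp.MultiSampleQuality = quality_levels - 1;      // just an index <= 7 in DX9; a
                                                               // CSAA-aware driver maps specific
                                                               // values to coverage-sample modes

              return d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, window,
                                       D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, out_device);
          }

        Under DX10 the runtime no longer caps the quality value, so the driver can expose its coverage-sample modes without that workaround, which matches what the quoted document says.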

  • by Alzheimers ( 467217 ) on Tuesday June 26, 2007 @03:04PM (#19653545)
    "Given how many copies of Vista are in use, a surprisingly small number of people came back to say they were not happy with our Vista drivers when we launched Vista Quality Assurance. Within a month the number of reported problems had been halved."

    Customers are funny: if you ignore them long enough, eventually they go away.
  • Resolution (Score:3, Insightful)

    by SpeedyGonz ( 771424 ) on Tuesday June 26, 2007 @03:12PM (#19653639)
    I don't want this to sound like the famous "640k should be enough for everyone", but...

    WQUXGA, 3840x2400, or nine million pixels.

    Sounds like overkill to me. I mean, I'm used to playing my games at 1280x1024 and I feel this resolution, maybe combined with a wee bit of AA, does the trick.

    I'd rather see all that horsepower invested in more frames/sec or cool effects. I know, it's cool to have the capability, but it makes me wonder about what another user posted here regarding the 8800 being a $700 paperweight 'cause of early adoption. You'll have a card capable of a gazillion pixels on a single frame, yet no monitor capable of showing it fully, and when the monitor finally comes out or achieves a good price/value relationship, your card is already obsolete. Null selling point there for moi.

    Just my "par de" cents.
    • Re:Resolution (Score:5, Insightful)

      by CastrTroy ( 595695 ) on Tuesday June 26, 2007 @03:23PM (#19653807)
      3DFX thought the same of 32-bit graphics. They were still making 16-bit cards when everyone else was doing 32-bit. In reality they got killer performance from doing 16-bit, blowing every other card out of the water in 16-bit performance. Most of the cards that had 32-bit couldn't even run most of the stuff in 32-bit because it ran too slow. 3DFX didn't care that it didn't do 32-bit, because 32-bit was too slow and didn't actually improve the game that much. Now 3DFX is gone. The problem is that a lot of gamers don't want to get the card that only supports 16-bit graphics, or in this case only supports 1900x1280 resolution, because they feel that they aren't getting as good a product, even if they can't tell the difference.
      • Re: (Score:1, Interesting)

        by Anonymous Coward
        There is a huge difference between 16-bit and 32-bit graphics. 16-bit graphics using textures meant for 32-bit rendering make the results appear like a badly encoded DivX/Xvid video. I'm glad 3DFX died, because if they were still around we wouldn't have made such great progress as we have. Could you imagine still using their half-assed OpenGL-like graphics API, Glide, today? I sure as hell couldn't.
      • Re: (Score:3, Insightful)

        by TheRaven64 ( 641858 )
        I think you're overplaying the importance of 32-bit colour. I didn't start turning it on until well after 3dfx was dead. The thing that really killed them was the GeForce. They used to own the top end of the gamer market, and kept pushing in this direction. The only difference between their cheap and expensive lines was the number of graphics processors on them, and none of them did transform and lighting. At the time, this meant that a lot of their power (and they used a lot, and generated a lot of no
        • I'm saying that although 32 bit colour wasn't all that important, I know a lot of people who thought that 3DFX had terrible cards simply because they didn't support 32 bit. Never mind that it was too slow to even use the feature most of the time; people liked knowing that their card supported 32 bit color, even if they could never use it. It seems to be the same thing here. They're supporting high resolutions, just to say they support them, when in reality nobody is using these high resolutions because th
          • Re:Resolution (Score:4, Insightful)

            by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday June 26, 2007 @05:25PM (#19655521) Homepage Journal

            I'm saying that although 32 bit colour wasn't all that important, I know a lot of people who thought that 3DFX had terrible cards simply because they didn't support 32 bit.

            Well, speaking as someone who was living in Austin amongst a bunch of gaming technerds, no one I knew gave one tenth of one shit about 32 bit graphics. In fact, while 3dfx was on top, you could instead get a Permedia-based card which would do 32 bit, and which had far better OpenGL support (as in, it supported more than you needed for Quake) and which was just a hair slower :) I was the only one who had one amongst my friends, and I only got it because I was tired of the problems inherent to the stupid passthrough design.

             No, what made the difference was the hardware T&L of the GeForce line. That was THE reason that I and all my friends went with one, and THE reason that nVidia is here today and 3dfx isn't.

            No one has yet adequately explained what the hell ATI is still doing here, but it must have something to do with having been the de facto standard for mobile and onboard video since time immemorial (until Intel decided to get a piece of these markets.) Practically every laptop I've owned with 3D acceleration has, sadly, had an ATI chip inside. And usually they do not behave well, to say the least...

            • by Moraelin ( 679338 ) on Wednesday June 27, 2007 @05:22AM (#19660629) Journal
              Actually, you know, it's sorta funny to hear people ranting and raving about how 32 bit killed 3dfx or lack of T&L killed 3dfx, without having even the faintest clue what actually happened to 3dfx.

              In a nutshell:

              1. 3dfx at one point decided to buy a graphics card manufacturer, just so, you know, they'd make more money by also manufacturing their own cards.

              2. They missed a cycle, because whatever software they were using to design their chips had a brain-fart and produced a non-functional chip design. So they spent 6 months rearranging the Voodoo 5 by hand.

              The Voodoo 5 wasn't supposed to go head to head with the GeForce 2. It was supposed to, at most, go head to head with the GF256 SDR, not even the DDR flavour. And it would have done well enough there, especially since at the time there was pretty much no software that did T&L anyway.

              But a 6 month delay was fatal. For all that time they had nothing better than a Voodoo 3 to compete with the GF256, and, frankly, it was outdated at that time. With or without 32 bit, it was a card that was the same generation as the TNT, so it just couldn't keep up. Worse yet, by the time the Voodoo 5 finally came out, it had to go head to head with the GF2, and it sucked there. It wasn't just the lack of T&L, it could barely keep up in terms of fill rate and lacked some features too. E.g., it couldn't even do trilinear and FSAA at the same time.

              Worse yet, see problem #1 I mentioned. The dip in sales meant they suddenly had a shitload of factory space that just sat idle and cost them money. And they just had no plan for what to do with that capacity. They had no other cards they could manufacture there. (The TV tuner they tried to make came too late and sold too little to save them.) Basically, while poor sales alone would have just meant less money, this one actually bled them money hand over fist. And that was maybe the most important factor that sank them.

              Add to that such mishaps as:

              3. The Voodoo 5 screenshot fuck-up. While the final image did look nice and did have 22 bit precision at 16 bit speeds, each of the 4 samples that went into it was a dithered 16 bit mess. There was no final combined image as such, there were 4 component images and the screen refresh circuitry combined them on the fly. And taking a screenshot in any game would get you the first of the 4 component images, so it looked a lot worse than what you'd see on the screen.

              Now it probably was a lot less important than #1 and #2 for sinking 3dfx, but it was a piece of bad press they could have done without. While the big review sites did soon figure out "wtf, there's something wrong with these screenshots", the fucked up images were already in the wild. And people who had never seen the original image were using them all over the place as final "proof" that 3dfx sucks and that 22 bit accuracy is a myth.
              • Actually, you know, it's sorta funny to hear people ranting and raving about how 32 bit killed 3dfx or lack of T&L killed 3dfx, without having even the faintest clue what actually happened to 3dfx.

                I remember all that you speak of.

                1. 3dfx at one point decided to buy a graphics card manufacturer, just so, you know, they'd make more money by also manufacturing their own cards.

                It wasn't a horrible idea, but going exclusive was.

                2. They missed a cycle, because whatever software they were using to design t

      • "The problem is, is that a lot of gamers don't want to get the card that only supports 16bit graphics, or in this case only supports 1900x1280 resolution. Because they feel that they aren't getting as good of a product, even if they can't tell the difference."

        Woah woah woah... you should not be comparing 16-bit vs 32-bit colour to 'high resolutions'. You could easily see the quantization errors with transparency effects like smoke or skies on 3dfx cards; you could easily tell the difference between 16-bit a
        • Go play Xenosaga Episode 3 (a game released last fall) for the PS2 and compare it to any modern PC game; if that game proves anything, it proves that artists and art direction are much more important than simply having high resolution. High resolution doesn't matter much if your art is not that great, or your game isn't either.

          "Our cards are designed for playing bad and ugly games!" is not a good sales pitch...

      • 3DFX thought the same of 32 bit graphics.

        Along the same lines as what got ATI in the running in the graphics market, pre-9500. (They were "in" the market, yeah, but they did not matter, IMO, until the 9500.)

        All of the graphics being put on screen were being drawn, even if you could not see them - say, a house with a fence in the back yard, grass, a lawn chair and other stuff that you can't see because you are standing in front of it.

        At the time that was a lot of CPU/GPU power being wasted, until the question "why" was ask

      • A 3dfx Voodoo3 card was able to compute internally at 22-bit precision. Then the final result was downscaled to 16 bits on the fly. So the card was fast (because 16-bit is faster than 32-bit) and the picture was nice (because 22-bit precision is prettier than 16-bit, and not so ugly compared to 32-bit). That was the trick, and it was fine at that time.
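
        The quantization being argued about in this subthread is easy to see in the pixel formats themselves: a 16-bit (RGB565) framebuffer keeps only 5 or 6 bits per channel, so smooth gradients like smoke or skies step in increments of 4 or 8 levels out of 256. A minimal, card-agnostic sketch of the packing (generic RGB565, not 3dfx's 22-bit internal filtering path):

          #include <cstdint>
          #include <cstdio>

          // Pack an 8-bit-per-channel colour into RGB565 and back, showing what a
          // 16-bit framebuffer throws away.
          static uint16_t pack565(uint8_t r, uint8_t g, uint8_t b) {
              return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
          }

          static void unpack565(uint16_t c, uint8_t& r, uint8_t& g, uint8_t& b) {
              r = (uint8_t)(((c >> 11) & 0x1F) << 3);   // 5 bits -> steps of 8
              g = (uint8_t)(((c >> 5)  & 0x3F) << 2);   // 6 bits -> steps of 4
              b = (uint8_t)(( c        & 0x1F) << 3);   // 5 bits -> steps of 8
          }

          int main() {
              // A smooth grey ramp: in 32-bit colour every value survives; in 16-bit,
              // neighbouring values collapse onto the same level, producing bands.
              for (int v = 100; v <= 110; ++v) {
                  uint8_t r, g, b;
                  unpack565(pack565((uint8_t)v, (uint8_t)v, (uint8_t)v), r, g, b);
                  std::printf("in %3d -> out r=%3u g=%3u b=%3u\n", v, r, g, b);
              }
              return 0;
          }

        Running the loop shows several adjacent 8-bit input values mapping to the same 16-bit output level, which is the banding people noticed in smoke and sky gradients.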
    • Re: (Score:1, Interesting)

      by Anonymous Coward
      There are already games that can't be run at a good (60+) framerate with maximum settings at 1920x1200 by a single GeForce 8800GTX. The person who referred to that card (incorrectly, in my opinion) as a $700 paperweight was likely referring to the problems with its Vista drivers.
    • Re: (Score:3, Funny)

      by feepness ( 543479 )
      I don't want this to sound like the famous "640k should be enough for everyone", but...

      WQUXGA, 3840x2400, or nine million pixels.


      How about five letter acronyms being enough for anyone?
    • Re: (Score:3, Interesting)

      by llZENll ( 545605 )
      On current displays, yes, it's overkill, but on displays in 10 years or less it will be the standard; it takes a lot of pixels to cover your entire field of view. Some may argue we don't need this much resolution, but until we are approaching real-life resolution and color depth, we will need more.

      A display of the future approaching the human eye's capabilities.

      A 60"-80" diameter hemisphere; it will probably be oval-shaped, since our field of vision is.
      2 GIGApixels (equal to about a 45000 x 45000 pixel image, 1000x
      • Re:Resolution (Score:4, Informative)

        by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday June 26, 2007 @05:34PM (#19655641) Homepage Journal

        A display of the future approaching the human eye's capabilities.

        You say this like it means something. It does not. Here's why.

        The real world is based on objects of infinite resolution. Our vision is limited by two things: the quality of the lens and other things in front of the retina, and our brain's ability to assemble the data that comes in, in some fashion useful to us that we perceive visually.

        A lot of people make the mistake of believing that the finest detail that we can resolve is in some way limited to the sizes or quantities of features on the retina. This is a bunch of bullshit. Here's why: Saccades [wikipedia.org]. Your brain will use your eye muscles without your knowledge or consent to move your eye around very rapidly in order to make up for deficiencies in the eye surface and to otherwise gather additional visual data.

        Have you ever seen a demo of the high-res cellphone scanning technique? There's software (or so I hear, I saw a video once and that's all) that will let you wave your cameraphone back and forth over a document. It takes multiple images, correlates and interpolates, and spits out a higher-resolution image. (No, I don't know why we haven't seen this technology become widespread, but I suspect it has something to do with processor time and battery life.) Your eye does precisely the same thing! This leads us to the other reason that your statement is disconnected from reality; what you think you are seeing is not, repeat not a perfect image of what is before you. Your eyes are actually not sufficiently advanced to provide you so much detail if that is what it was!

        No, what you think you are seeing is actually an internal representation of what is around you, built out of visual data (from the optic nerve, which performs substantial preprocessing of the retinal information) and from memories. Your brain fills in that part of the "image" for which it does not have good information from your own mind. This is why you so commonly think that something looks like something else at first glance - your brain made an error. It does the best it can, but it only has so much time to pick something and stuff it in the hole.

        Stop trying to equate vision to a certain number of pixels. It's different for everyone, and it's only partially based on your hardware. Your brain does vastly more processing than you are apparently aware. Some people get to see things that aren't there all the time! (Or maybe it's the rest of us whose visual system has problems? Take that thought to bed with you tonight.)
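
        The cameraphone trick described above (several slightly offset low-resolution captures combined into one higher-resolution image) is essentially shift-and-add super-resolution. A minimal sketch, with the sub-pixel offsets assumed known rather than estimated (real software also has to do the "correlate" step; the data here is a synthetic toy example):

          #include <cstddef>
          #include <cstdio>
          #include <vector>

          // One low-resolution capture of the scene, taken at a known sub-pixel offset.
          struct Frame {
              int w, h;
              double dx, dy;              // offset in low-res pixel units
              std::vector<double> px;     // grayscale samples, row-major
          };

          // Accumulate every low-res sample onto a finer grid at its true (shifted)
          // position, then average. Gaps would normally be filled by interpolation.
          std::vector<double> shift_and_add(const std::vector<Frame>& frames,
                                            int scale, int out_w, int out_h) {
              std::vector<double> sum(out_w * out_h, 0.0), count(out_w * out_h, 0.0);
              for (const Frame& f : frames)
                  for (int y = 0; y < f.h; ++y)
                      for (int x = 0; x < f.w; ++x) {
                          const int hx = (int)((x + f.dx) * scale + 0.5);
                          const int hy = (int)((y + f.dy) * scale + 0.5);
                          if (hx < 0 || hx >= out_w || hy < 0 || hy >= out_h) continue;
                          sum[hy * out_w + hx]   += f.px[y * f.w + x];
                          count[hy * out_w + hx] += 1.0;
                      }
              for (std::size_t i = 0; i < sum.size(); ++i)
                  if (count[i] > 0.0) sum[i] /= count[i];
              return sum;
          }

          int main() {
              // Toy example: four 3x3 frames of a smooth ramp, each shifted by half a
              // low-res pixel, reconstructed onto a 6x6 grid (2x upscale).
              std::vector<Frame> frames;
              for (int k = 0; k < 4; ++k) {
                  Frame f;
                  f.w = 3; f.h = 3;
                  f.dx = 0.5 * (k % 2);
                  f.dy = 0.5 * (k / 2);
                  for (int y = 0; y < f.h; ++y)
                      for (int x = 0; x < f.w; ++x)
                          f.px.push_back(10.0 * (x + f.dx) + (y + f.dy));
                  frames.push_back(f);
              }
              const std::vector<double> hi = shift_and_add(frames, 2, 6, 6);
              for (int y = 0; y < 6; ++y) {
                  for (int x = 0; x < 6; ++x) std::printf("%6.1f", hi[y * 6 + x]);
                  std::printf("\n");
              }
              return 0;
          }

        The eye analogy in the comment maps onto the same idea: many slightly offset, individually limited samples combined into a better estimate.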

        • by llZENll ( 545605 )
          "A lot of people make the mistake of believing that the finest detail that we can resolve is in some way limited to the sizes or quantities of features on the retina."

           Of course it is: if it's not limited by the cones or rods, it's limited by the scanning rate, optic nerve rate, or the rate at which saccades happen. IT IS LIMITED; it's just that we don't know the current limits technically. Even if we don't know how to calculate them through biological measurements, they are very easy to measure through subjec
    • WQUXGA, 3840x2400, or nine million pixels.

      Sounds like overkill to me. I mean, I'm used to playing my games at 1280x1024 and I feel this resolution, maybe combined with a wee bit of AA, does the trick.

      I used to feel this way, running at 1280x1024 on a pretty decent 19" CRT. However, about a year ago I finally upgraded to a 22" widescreen LCD with a native resolution of 1600x1050 and the difference it made was astonishing. Games that supported high resolution (Company of Heroes, WoW, Oblivion, etc.) felt incredibly more open. For contrast, I recently reloaded Enemy Territory on my system, which I have to run in a 1280x1024 window because the full-screen standard resolutions look like crap on my wide-s

  • by Anonymous Coward on Tuesday June 26, 2007 @03:25PM (#19653837)
    Speaking of Nvidia and PC game development: why the hell are all the new versions of their useful utilities, like FX Composer 2 (betas I tried to test), now requiring Windows XP (with SP2) and dropping Windows 2000 support? Win2k and WinXP have virtually zero differences in hardware support and driver system architecture. I should know, since I've programmed a few drivers using Microsoft's driver development kit, and according to the docs nothing has changed from Win2k to WinXP for drivers and the majority of the APIs, just additional features.

    The thing that pisses me off is that Nvidia seems to have done this for absolutely no reason at all, and Windows 2000 is still a fine operating system for me. I have no reason at all to switch to Windows XP (and hell no to Vista); I especially don't care for the activation headaches (I like to switch around hardware from time to time to play around with new stuff and go back once I've gotten bored with it if I don't need it, such as borrowing a friend's dual-P4 motherboard).

    Anyway, my point/question: why must Nvidia feel the need to force their customers who use their hardware for developing games onto later Windows operating systems like that? Anybody got any tips on how to 'lie' or disable the Windows version check to force, say, FX Composer 2 to install on Windows 2000? It isn't like we're talking about Windows 98 here; Win2k is a fine OS and, in my opinion, actually the best one Microsoft has ever done.
    • by 0racle ( 667029 )
      2000 has been end-of-lifed. It is not supported by MS, and eventually others are also going to be dropping support for it.
      • by afidel ( 530433 )
        Huh? Windows 2000 Professional and Server are in extended support until Jan 31, 2010. There won't be any non-security related hotfixes anymore but how many people still running 2000 want anything BUT security hotfixes?
    • by archen ( 447353 )
      A lot of Nvidia's driver development makes no sense. Apparently, according to Nvidia, Windows 2003 32-bit does not exist. So much for dual monitors on that machine.
    • by dsyu ( 203328 )
      Viva la Win2K!

      Anybody got any tips on how to 'lie' or disable the windows version check to force say FX Composer 2 to install on Windows 2000?

      I too would like to know if there's a way to do this.
    • by Bardez ( 915334 )
      Damn straight.

      Windows 2000 was probably the most stable of the user OSes I've seen Microsoft roll out. XP, sure, it has a firewall and all, but the only thing I like about XP over 2000 -- the ONLY THING -- is the integration of browsing into a *.zip file. That's it. The install is four times as big and just as stable. I really never saw the need to buy XP, ever. Work environments have been my main source of exposure to it.
  • Linux? (Score:2, Interesting)

    by Anonymous Coward
    My question would be how NVidia is helping game developers write for and port to Linux. If popular games were more compatible there, it'd be a lot easier to get converts; and I'd expect the game developers would be happy to see more of my software dollars go to their products rather than to OS upgrades.
    • While only sort of relating to Linux, I'd be interested to hear any comments about unlocking the potential of hardware via OpenGL. OpenGL runs on multiple platforms, and a good driver should, in theory, allow developers to take advantage of all that fancy new "Designed for DX10" hardware. I was hoping that Microsoft's handling of DirectX 10 would encourage developers to take this kind of route, as it would allow them not only to eventually exceed some of the limitations and capabilities of DX9, but do i
      • by S3D ( 745318 ) on Tuesday June 26, 2007 @06:30PM (#19656295)

        While only sort of relating to Linux, I'd be interested to hear any comments about unlocking the potential of hardware via OpenGL.
        You can check the OpenGL pipeline newsletters [opengl.org]. Unified shader support is part of the OpenGL "Mt. Evans" ARB extensions, which are targeted for the October 2007 release. "Mt. Evans" will support geometry (unified) shaders and improvements to buffer objects. Geometry shaders are supported even now as NVIDIA extensions (GL_EXT_gpu_shader4, GL_EXT_geometry_shader4, GL_NV_gpu_program4, GL_NV_geometry_program4, etc.). So it seems all the functionality is available through OpenGL.
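
        A quick way to see which of those listed extensions a given driver actually exposes is the classic (pre-GL3) extension-string query. A minimal sketch, assuming a GL context has already been created and made current by whatever windowing toolkit you use (context creation is not shown):

          #include <cstdio>
          #include <cstring>
          #include <GL/gl.h>   // link against the system OpenGL library (e.g. -lGL or opengl32.lib)

          // Naive substring check against the classic GL_EXTENSIONS string.
          // glGetString only returns something useful once a context is current.
          static bool has_extension(const char* name) {
              const char* all = (const char*)glGetString(GL_EXTENSIONS);
              return all != 0 && std::strstr(all, name) != 0;
          }

          int main() {
              const char* wanted[] = {
                  "GL_EXT_gpu_shader4",
                  "GL_EXT_geometry_shader4",
                  "GL_NV_gpu_program4",
                  "GL_NV_geometry_program4",
              };
              for (const char* ext : wanted)
                  std::printf("%-26s %s\n", ext, has_extension(ext) ? "available" : "missing");
              return 0;
          }

        If the strings show up, the corresponding entry points can then be loaded through the platform's usual GL extension-loading mechanism.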
        • So why are games being written for Direct3D? Why would a developer voluntarily chain himself to a single vendor, any vendor, let alone Microsoft?

          What would they be giving up by writing to OpenGL? It runs on Windows, right?

          • by S3D ( 745318 )
            There are some reasons. Sadly, D3D drivers are usually more mature - more stable, with fewer bugs - and therefore there are fewer problems with support for different graphics cards. DirectX is an integrated SDK, which includes not only the 3D pipeline but also sound, video and extensive support for 3D modeling (X file format). Video memory management was better in D3D up until the latest OpenGL version, which was quite important for big seamless 3D worlds with run-time block loading. There is a generation of coders who don't know OpenGL - it's ea
  • Some of the screenshots and videos for games like Crysis are really amazing. There's a long way to go, but we are definitely on the cusp of the next generation of games.

    This is about right: when the Xbox came out, it was about on par with PCs at the time. Six months to one year down the track, the top-of-the-line PCs were way ahead. Now, the 360 and PS3 (which isn't living up to the hype; most of the graphics on 360 and PS3 are about the same despite the 360 being a year older) aren't competing with the t
    • by _2Karl ( 1121451 )
      Yes, Crysis does look amazing. But so what? It's just ANOTHER first-person shooter. I'm sick of them. To be honest, I find the more realistic games become, the less I enjoy them. I want escapism. If I want to experience realistic physics I'll go outside and stand under an apple tree. Maybe I'm just jaded from working in the games industry for so long, but I often find myself pining for the "glory days" of the '90s. Point-and-click adventures like Monkey Island, fantastic strategy games like Syndicate... A l
      • Sure, but Crysis, much like Far Cry, was never really about gameplay; it was all about the engine. This is the rare-edition Eagles live-in-concert recording for the hi-fi fanatic. This is a game for gamers who are in it for the shiny hardware. You need something to put that gear through its paces, right? This is a vehicle to show us what's possible. Hopefully the technologies and benchmarks set by this game will be adopted and integrated into future games with great gameplay.

        The industry must move fo
