
NVIDIA On Their Role in PC Games Development

GamingHobo writes "Bit-Tech has posted an interview with NVIDIA's Roy Taylor, Senior Vice President of Content/Developer Relations, which discusses his team's role in the development of next-gen PC games. He also talks about DirectX 10 performance, Vista drivers and some of the upcoming games he is anticipating the most. From the article: 'Developers wishing to use DX10 have a number of choices to make ... But the biggest is whether to layer over a DX9 title some additional DX10 effects or to decide to design for DX10 from the ground up. Both take work but one is faster to get to market than the other. It's less a question of whether DX10 is working optimally on GeForce 8-series GPUs and more a case of how is DX10 being used. To use it well — and efficiently — requires development time.'"
  • Re:Just one question (Score:4, Informative)

    by merreborn ( 853723 ) on Tuesday June 26, 2007 @02:57PM (#19653433) Journal

    but they're not designed for gaming because the refresh rates are too low


    http://en.wikipedia.org/wiki/QXGA#WQUXGA [wikipedia.org]

    Apparently, the existing monitors at WQUXGA (worst. acronym. ever.) resolution run at 41Hz, max. These days, top-of-the-line game systems will pump out upwards of 100 frames/sec in some cases. A 41Hz refresh rate essentially caps you at 41 FPS, which is enough to turn off any gamer looking at blowing that much on a gaming rig.
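
    To put the cap in code: a minimal sketch, assuming a simple vsync model in which the display shows at most one new frame per refresh (the numbers are the ones from the comment above):

        /* Minimal sketch: with vsync, the display shows at most one new
           frame per refresh, so the visible frame rate is capped by the
           refresh rate no matter how fast the GPU renders. */
        #include <stdio.h>

        static double visible_fps(double render_fps, double refresh_hz)
        {
            return render_fps < refresh_hz ? render_fps : refresh_hz;
        }

        int main(void)
        {
            /* A GPU pushing 100 frames/sec into a 41Hz panel. */
            printf("visible: %.0f FPS\n", visible_fps(100.0, 41.0)); /* 41 */
            return 0;
        }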
  • Re:Just one question (Score:2, Informative)

    by Actually, I do RTFA ( 1058596 ) on Tuesday June 26, 2007 @03:45PM (#19654103)

    41 FPS for display purposes. However, much of the time physics/AI/etc. are done "per-frame." A higher FPS will still affect those (more so since any decently threaded game will spend fewer resources on rendering; see the loop sketch after this comment). Most display systems cannot handle 100Hz, and most humans cannot tell the difference above 25-30 Hz. It's only games where slow displays lead to slow calculated frames that this will cause a problem. That and arrogant SOBs who claim they can tell the difference without FRAPS.

    Plus, at that resolution you are fill-bound anyway.
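
    To make the "per-frame" point concrete: the usual way to keep simulation rate independent of display rate is a fixed-timestep loop. A minimal sketch, assuming a 100Hz tick rate and using the C standard clock() as a stand-in timer (a real game would use a proper wall-clock source):

        #include <stdio.h>
        #include <time.h>

        #define DT 0.01 /* fixed simulation step: 100 ticks/sec (assumed) */

        static double now_seconds(void)
        {
            return (double)clock() / CLOCKS_PER_SEC; /* coarse stand-in timer */
        }

        static void simulate(double dt) { (void)dt; /* physics/AI step here */ }
        static void render(void)        { /* draw; may be vsync-capped */ }

        int main(void)
        {
            double previous = now_seconds();
            double lag = 0.0;
            for (int frame = 0; frame < 1000; ++frame) { /* bounded for the demo */
                double current = now_seconds();
                lag += current - previous;
                previous = current;
                /* Run as many fixed steps as elapsed time demands,
                   so a slow display never slows the simulation down. */
                while (lag >= DT) {
                    simulate(DT);
                    lag -= DT;
                }
                render();
            }
            return 0;
        }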

  • Re:Just one question (Score:2, Informative)

    by White Flame ( 1074973 ) on Tuesday June 26, 2007 @05:28PM (#19655551)

    Most display systems cannot handle 100Hz, and most humans cannot tell the difference above 25-30 Hz. It's only games where slow displays lead to slow calculated frames that this will cause a problem. That and arrogant SOBs who claim they can tell the difference without FRAPS.

    *sigh* Where do people come up with this garbage? Look at some evidence already instead of making stuff up:

    http://mckack.diinoweb.com/files/kimpix-video/ [diinoweb.com]

  • Re:Resolution (Score:4, Informative)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday June 26, 2007 @05:34PM (#19655641) Homepage Journal

    Display of the future approaching the human eye's capabilities.

    You say this like it means something. It does not. Here's why.

    The real world is based on objects of infinite resolution. Our vision is limited by two things: the quality of the lens and the other structures in front of the retina, and our brain's ability to assemble the incoming data into some form useful to us, which we perceive visually.

    A lot of people make the mistake of believing that the finest detail we can resolve is somehow limited by the sizes or quantities of features on the retina. This is a bunch of bullshit. Here's why: Saccades [wikipedia.org]. Your brain will use your eye muscles without your knowledge or consent to move your eye around very rapidly, in order to make up for deficiencies in the eye surface and to otherwise gather additional visual data.

    Have you ever seen a demo of the high-res cellphone scanning technique? There's software (or so I hear; I saw a video once and that's all) that will let you wave your cameraphone back and forth over a document. It takes multiple images, correlates and interpolates, and spits out a higher-resolution image (a toy version of the idea is sketched after this comment). (No, I don't know why we haven't seen this technology become widespread, but I suspect it has something to do with processor time and battery life.) Your eye does precisely the same thing! This leads us to the other reason your statement is disconnected from reality: what you think you are seeing is not, repeat not, a perfect image of what is before you. Your eyes alone are simply not advanced enough to provide that much detail!

    No, what you think you are seeing is actually an internal representation of what is around you, built out of visual data (from the optic nerve, which performs substantial preprocessing of the retinal information) and from memories. Your brain fills in that part of the "image" for which it does not have good information from your own mind. This is why you so commonly think that something looks like something else at first glance - your brain made an error. It does the best it can, but it only has so much time to pick something and stuff it in the hole.

    Stop trying to equate vision to a certain number of pixels. It's different for everyone, and it's only partially based on your hardware. Your brain does vastly more processing than you are apparently aware. Some people get to see things that aren't there all the time! (Or maybe it's the rest of us whose visual system has problems? Take that thought to bed with you tonight.)
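
    A toy version of the multi-exposure trick described in the comment above, in 1D: several low-resolution samplings of the same signal, each shifted by a known sub-pixel offset, interleave into a result at higher resolution than any single exposure. The signal and offsets are invented for illustration; this is the principle, not the actual cellphone software:

        #include <stdio.h>
        #include <math.h>

        #define HI    32            /* high-res "scene" length      */
        #define SHOTS 4             /* number of offset exposures   */
        #define LO    (HI / SHOTS)  /* low-res samples per exposure */

        static const double PI = 3.14159265358979;

        /* The underlying scene; stands in for "infinite resolution". */
        static double truth(int i)
        {
            return sin(2.0 * PI * i / HI);
        }

        int main(void)
        {
            double shots[SHOTS][LO];

            /* Each exposure samples every SHOTS-th point, offset by s. */
            for (int s = 0; s < SHOTS; ++s)
                for (int j = 0; j < LO; ++j)
                    shots[s][j] = truth(j * SHOTS + s);

            /* Knowing the offsets, interleave back to full resolution. */
            for (int i = 0; i < HI; ++i)
                printf("%2d: %+.3f (from exposure %d)\n",
                       i, shots[i % SHOTS][i / SHOTS], i % SHOTS);
            return 0;
        }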

  • by S3D ( 745318 ) on Tuesday June 26, 2007 @06:30PM (#19656295)

    While only sort of relating to Linux, I'd be interested to hear any comments about unlocking the potential of hardware via OpenGL.
    You can check the OpenGL Pipeline newsletters [opengl.org]. Unified shader support is part of the OpenGL "Mt. Evans" ARB extensions, targeted for an October 2007 release. "Mt. Evans" will add geometry (unified) shaders and improvements to buffer objects. Geometry shaders are supported even now as NVIDIA extensions (GL_EXT_gpu_shader4, GL_EXT_geometry_shader4, GL_NV_gpu_program4, GL_NV_geometry_program4, etc.), so it seems all the functionality is already available through OpenGL.
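
    For what it's worth, the era-appropriate way to test for those extensions before relying on them is to scan the GL_EXTENSIONS string. A minimal sketch, assuming a valid OpenGL context is already current (window/context setup omitted):

        #include <GL/gl.h>
        #include <stdio.h>
        #include <string.h>

        /* Naive substring match; fine for a sketch, though a robust
           check would match whole space-delimited extension names. */
        static int has_extension(const char *name)
        {
            const char *exts = (const char *)glGetString(GL_EXTENSIONS);
            return exts != NULL && strstr(exts, name) != NULL;
        }

        void report_geometry_shader_support(void)
        {
            if (has_extension("GL_EXT_geometry_shader4"))
                printf("Geometry shaders available (GL_EXT_geometry_shader4)\n");
            else
                printf("No geometry shader extension found\n");
        }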
  • by Moraelin ( 679338 ) on Wednesday June 27, 2007 @05:22AM (#19660629) Journal
    Actually, you know, it's sorta funny to hear people ranting and raving about how 32 bit killed 3dfx or lack of T&L killed 3dfx, without having even the faintest clue what actually happened to 3dfx.

    In a nutshell:

    1. 3dfx at one point decided to buy a graphics card manufacturer, just so, you know, they'd make more money by also manufacturing their own cards.

    2. They missed a cycle, because whatever software they were using to design their chips had a brain-fart and produced a non-functional chip design. So they spent 6 months rearranging the Voodoo 5 by hand.

    The Voodoo 5 wasn't supposed to go head to head with the GeForce 2. It was supposed to, at most, go head to head with the GF256 SDR, not even the DDR flavour. And it would have done well enough there, especially since at the time there was pretty much no software that did T&L anyway.

    But a 6-month delay was fatal. For all that time they had nothing better than a Voodoo 3 to compete with the GF256, and, frankly, it was outdated by then. With or without 32 bit, it was a card of the same generation as the TNT, so it just couldn't keep up. Worse yet, by the time the Voodoo 5 finally came out, it had to go head to head with the GF2, and it sucked there. It wasn't just the lack of T&L; it could barely keep up in terms of fill rate and lacked some features too. E.g., it couldn't even do trilinear filtering and FSAA at the same time.

    Worse yet, see problem #1 I mentioned. The dip in sales meant they suddenly had a shitload of factory space that just sat idle and cost them money. And they had no plan for what to do with that capacity. They had no other cards they could manufacture there. (The TV tuner they tried to make came too late and sold too little to save them.) Basically, while poor sales alone would have just meant less money, this one actually bled them money hand over fist. And that was maybe the most important factor that sunk them.

    Add to that such mishaps as:

    3. The Voodoo 5 screenshot fuck-up. While the final image did look nice and did have 22-bit precision at 16-bit speeds, each of the 4 samples that went into it was a dithered 16-bit mess. There was no final combined image as such; there were 4 component images, and the screen refresh circuitry combined them on the fly (the combining step is sketched after this comment). And taking a screenshot in any game would get you the first of the 4 component images, so it looked a lot worse than what you'd see on the screen.

    Now it probably was a lot less important than #1 and #2 for sinking 3dfx, but it was a piece of bad press they could have done without. While the big review sites did soon figure out "wtf, there's something wrong with these screenshots", the fucked up images were already in the wild. And people who had never seen the original image were using them all over the place as final "proof" that 3dfx sucks and that 22 bit accuracy is a myth.
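
    A sketch of the combining step described in #3, under the assumption that the refresh circuitry effectively summed the four samples: four independently dithered 5-6-5 samples add up to 7+8+7 = 22 bits of effective precision, which is where the "22 bit" figure comes from. The sample values and the exact hardware logic here are made up for illustration:

        #include <stdio.h>
        #include <stdint.h>

        static unsigned red5(uint16_t p)   { return (p >> 11) & 0x1F; }
        static unsigned green6(uint16_t p) { return (p >> 5)  & 0x3F; }
        static unsigned blue5(uint16_t p)  { return  p        & 0x1F; }

        int main(void)
        {
            /* Four 16-bit (R5G6B5) samples of the same pixel, each
               dithered differently (values invented for illustration). */
            uint16_t samples[4] = { 0x7BEF, 0x7BCF, 0x7BEE, 0x7BAF };

            unsigned r = 0, g = 0, b = 0;
            for (int i = 0; i < 4; ++i) {
                r += red5(samples[i]);
                g += green6(samples[i]);
                b += blue5(samples[i]);
            }
            /* Sums of four 5/6/5-bit values are 7/8/7-bit quantities:
               the 22-bit effective image the refresh hardware produced,
               and what a screenshot of one component image misses. */
            printf("combined R=%u/124 G=%u/252 B=%u/124\n", r, g, b);
            return 0;
        }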

"It's a dog-eat-dog world out there, and I'm wearing Milkbone underware." -- Norm, from _Cheers_

Working...