
NVIDIA On Their Role in PC Games Development

GamingHobo writes "Bit-Tech has posted an interview with NVIDIA's Roy Taylor, Senior Vice President of Content/Developer Relations, which discusses his team's role in the development of next-gen PC games. He also talks about DirectX 10 performance, Vista drivers and some of the upcoming games he is anticipating the most. From the article: 'Developers wishing to use DX10 have a number of choices to make ... But the biggest is whether to layer over a DX9 title some additional DX10 effects or to decide to design for DX10 from the ground up. Both take work but one is faster to get to market than the other. It's less a question of whether DX10 is working optimally on GeForce 8-series GPUs and more a case of how is DX10 being used. To use it well — and efficiently — requires development time.'"
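The layering-versus-ground-up choice Taylor describes also has a mundane runtime side: a 2007-era PC title that wants DX10 effects still has to work on XP machines, where Direct3D 10 simply doesn't exist. Below is a minimal, hypothetical sketch (not from the article) of how an engine might probe for D3D10 at startup and fall back to D3D9; the function names around the probe are made up for illustration.

```cpp
// Hypothetical startup probe: use D3D10 if it is available, else D3D9.
#include <windows.h>
#include <d3d9.h>
#include <d3d10.h>

// Resolved at runtime so the same binary still loads on Windows XP,
// where d3d10.dll is not present.
typedef HRESULT (WINAPI *PFN_D3D10CreateDevice)(
    IDXGIAdapter*, D3D10_DRIVER_TYPE, HMODULE, UINT, UINT, ID3D10Device**);

static bool TryCreateD3D10(ID3D10Device** outDevice)
{
    HMODULE d3d10 = LoadLibraryA("d3d10.dll");            // Vista only
    if (!d3d10)
        return false;

    PFN_D3D10CreateDevice create =
        (PFN_D3D10CreateDevice)GetProcAddress(d3d10, "D3D10CreateDevice");
    if (!create)
        return false;

    return SUCCEEDED(create(NULL, D3D10_DRIVER_TYPE_HARDWARE,
                            NULL, 0, D3D10_SDK_VERSION, outDevice));
}

void InitRenderer()
{
    ID3D10Device* dev10 = NULL;
    if (TryCreateD3D10(&dev10)) {
        // DX10 path: extra effects layered over the DX9 renderer,
        // or a renderer designed for DX10 from the ground up.
    } else {
        // DX9 path: IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION); ...
    }
}
```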
  • by anss123 ( 985305 ) on Tuesday June 26, 2007 @03:01PM (#19653513)
    "As the only manufacturer with DirectX 10 hardware, we had more work to do than any other hardware manufacturer because there were two drivers to develop (one for DX9 and one for DX10). In addition to that, we couldn't just stop developing XP drivers too, meaning that there were three development cycles in flight at the same time."

    Didn't ATI kick out some DX10 hardware the other day? I'm pretty sure the ATI Radeon HD 2900 series is DX10.

    "Our research shows that PC gamers buy five or more games per year, and they're always looking for good games with great content.

    Interesting, but makes me wonder what they lay in the definition PC gamer.

    "Tony and David are right, there are API reductions, massive AA is 'almost free' with DX10. This is why we are able to offer CSAA [up to 16xAA] with new DX10 titles - the same thing with DX9 just isn't practical. Also interesting, but I'm skeptical. Turning on AA is just one API call, how does AA affect overhead?

    "So yes we will see big performance jumps in DX10 and Vista as we improve drivers but to keep looking at that area is to really miss the point about DX10. It's not about - and it was never about - running older games at faster frame rates. Wait, rewind. Are he saying my DX7/8/9 games will run faster once Nivida gets their DX10 drivers together? Or is he saying games with DX9 level of graphics will run faster if ported to DX10?

    "Five years from now, we want to be able to walk into a forest, set it on fire and for it to then rain (using a decent depth of field effect) and to then show the steam coming off the ashes when the fire is being put out."

    No, I can do that in real life. A Pyromaniacs VS firefighters burn fest OTOH....
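On the AA overhead question above: in Direct3D 10 an application asks for multisampling when it creates its swap chain or render targets, so "turning it on" really is only a little setup code, but the GPU still pays memory and bandwidth for the extra samples; CSAA is NVIDIA's scheme for storing more coverage samples than colour samples to trim that cost. A rough, hypothetical sketch of the setup side, assuming the vendor exposes its extra modes through the reported quality levels (an assumption, not something the article states):

```cpp
// Illustrative only: pick a multisample setting for a D3D10 swap chain.
#include <d3d10.h>
#include <dxgi.h>

DXGI_SAMPLE_DESC PickSampleDesc(ID3D10Device* device)
{
    DXGI_SAMPLE_DESC sd = { 1, 0 };   // default: no multisampling
    UINT levels = 0;

    // Ask the driver what it supports for 4 colour samples per pixel.
    // On some hardware, higher quality levels correspond to coverage-sample
    // modes such as CSAA (vendor-specific; treat this as an assumption).
    if (SUCCEEDED(device->CheckMultisampleQualityLevels(
            DXGI_FORMAT_R8G8B8A8_UNORM, 4, &levels)) && levels > 0)
    {
        sd.Count   = 4;
        sd.Quality = levels - 1;      // highest reported quality level
    }
    return sd;                        // plug into DXGI_SWAP_CHAIN_DESC.SampleDesc
}
```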
  • Resolution (Score:3, Insightful)

    by SpeedyGonz ( 771424 ) on Tuesday June 26, 2007 @03:12PM (#19653639)
    I don't want this to sound like the famous "640k should be enough for everyone", but...

    WQUXGA, 3840x2400, or nine million pixels.

    Sounds like overkill to me. I mean, I'm used to playing my games @ 1280x1024, and I feel this resolution, maybe combined with a wee bit of AA, does the trick.

    I'd rather see all that horsepower invested in more frames/sec or cool effects. I know it's cool to have the capability, but it makes me wonder about what another user posted here regarding the 8800 being a $700 paperweight 'cause of early adoption. You'll have a card capable of a gazillion pixels in a single frame, yet no monitor capable of showing it fully, and by the time the monitor finally comes out or reaches a good price/value ratio, your card is already obsolete. Null selling point there for moi.

    Just my "par de" cents.
  • Re:Resolution (Score:5, Insightful)

    by CastrTroy ( 595695 ) on Tuesday June 26, 2007 @03:23PM (#19653807)
    3dfx thought the same of 32-bit graphics. They were still making 16-bit cards when everyone else was doing 32-bit. In reality they got killer performance from doing 16-bit, blowing every other card out of the water in 16-bit performance. Most of the cards that had 32-bit support couldn't even run most games in 32-bit because it was too slow. 3dfx didn't care that they didn't do 32-bit, because 32-bit was too slow and didn't actually improve the game that much. Now 3dfx is gone. The problem is that a lot of gamers don't want to get the card that only supports 16-bit graphics, or in this case only supports 1900x1280 resolution, because they feel they aren't getting as good a product, even if they can't tell the difference.
  • Re:Resolution (Score:3, Insightful)

    by TheRaven64 ( 641858 ) on Tuesday June 26, 2007 @04:45PM (#19654969) Journal
    I think you're overplaying the importance of 32-bit colour. I didn't start turning it on until well after 3dfx was dead. The thing that really killed them was the GeForce. They used to own the top end of the gamer market, and they kept pushing in that direction. The only difference between their cheap and expensive lines was the number of graphics processors on them, and none of them did transform and lighting. At the time, this meant that a lot of their power (and they used a lot, and generated a lot of noise and heat) was wasted because games were CPU-bound, with the slow CPU (I had a 350MHz K6-2 at the time) handling the geometry set-up. You could, I think, get better performance with a high-end Voodoo card and a beefy CPU, but it cost a huge amount more than a GeForce and a slow CPU, without much benefit.

    Missing the boat on transform and lighting was a major problem, but they also made some serious tactical mistakes, like starting to manufacture their own boards and alienating their OEM partners.

  • Re:Resolution (Score:4, Insightful)

    by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Tuesday June 26, 2007 @05:25PM (#19655521) Homepage Journal

    "I'm saying that although 32 bit colour wasn't all that important, I know a lot of people who thought that 3DFX had terrible cards simply because they didn't support 32 bit."

    Well, speaking as someone who was living in Austin amongst a bunch of gaming technerds, no one I knew gave one tenth of one shit about 32 bit graphics. In fact, while 3dfx was on top, you could instead get a Permedia-based card which would do 32 bit, and which had far better OpenGL support (as in, it supported more than you needed for Quake) and which was just a hair slower :) I was the only one who had one amongst my friends, and I only got it because I was tired of the problems inherent to the stupid passthrough design.

    No, what made the difference was the hardware T&L of the GeForce line. That was THE reason that I and all my friends went with one, and THE reason that NVIDIA is here today and 3dfx isn't. (A rough sketch of the per-vertex work T&L covers follows these comments.)

    No one has yet adequately explained what the hell ATI is still doing here, but it must have something to do with having been the de facto standard for mobile and onboard video since time immemorial (until Intel decided to get a piece of these markets.) Practically every laptop I've owned with 3D acceleration has, sadly, had an ATI chip inside. And usually they do not behave well, to say the least...
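For context on the T&L point raised in the last two comments: "transform and lighting" is the per-vertex arithmetic (a matrix transform plus a simple lighting term) that pre-GeForce consumer cards left to the CPU and that the GeForce moved into fixed-function hardware. A toy sketch of that per-vertex loop, with made-up structures purely for illustration:

```cpp
// Illustrative only: the kind of per-vertex work "hardware T&L" offloads.
#include <vector>

struct Vec3   { float x, y, z; };
struct Mat4   { float m[4][4]; };            // row-major, w assumed 1
struct Vertex { Vec3 pos; Vec3 normal; float diffuse; };

static Vec3 Transform(const Mat4& m, const Vec3& v)
{
    return {
        m.m[0][0]*v.x + m.m[0][1]*v.y + m.m[0][2]*v.z + m.m[0][3],
        m.m[1][0]*v.x + m.m[1][1]*v.y + m.m[1][2]*v.z + m.m[1][3],
        m.m[2][0]*v.x + m.m[2][1]*v.y + m.m[2][2]*v.z + m.m[2][3],
    };
}

// One directional light, Lambert diffuse -- run for every vertex, every frame.
// Doing this on a late-90s CPU is what made geometry-heavy scenes CPU-bound.
void TransformAndLight(std::vector<Vertex>& verts,
                       const Mat4& worldViewProj, const Vec3& lightDir)
{
    for (Vertex& v : verts) {
        v.pos = Transform(worldViewProj, v.pos);
        float ndotl = v.normal.x*lightDir.x
                    + v.normal.y*lightDir.y
                    + v.normal.z*lightDir.z;
        v.diffuse = ndotl > 0.0f ? ndotl : 0.0f;
    }
}
```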
