NVIDIA On Their Role in PC Games Development
GamingHobo writes "Bit-Tech has posted an interview with NVIDIA's Roy Taylor, Senior Vice President of Content/Developer Relations, which discusses his team's role in the development of next-gen PC games. He also talks about DirectX 10 performance, Vista drivers and some of the upcoming games he is anticipating the most. From the article: 'Developers wishing to use DX10 have a number of choices to make ... But the biggest is whether to layer over a DX9 title some additional DX10 effects or to decide to design for DX10 from the ground up. Both take work but one is faster to get to market than the other. It's less a question of whether DX10 is working optimally on GeForce 8-series GPUs and more a case of how is DX10 being used. To use it well — and efficiently — requires development time.'"
Just one question (Score:2)
FTFA:
Conjecture aside, what refresh rates are they using now?
I would have as
Re:Just one question (Score:4, Informative)
http://en.wikipedia.org/wiki/QXGA#WQUXGA [wikipedia.org]
Apparently, the existing monitors at WQUXGA (worst. acronym. ever.) resolution run at 41 Hz, max. These days, top-of-the-line game systems will pump out upwards of 100 frames/sec in some cases. A 41 Hz refresh rate essentially caps you at 41 FPS, which is enough to turn off any gamer looking at blowing that much on a gaming rig.
Re: (Score:2, Informative)
41 FPS for display purposes. However, physics/AI/etc. are often done "per-frame." A higher FPS will still affect those (more so since any decently threaded game will use fewer resources rendering). Most display systems cannot handle 100 Hz, and most humans cannot tell the difference above 25-30 Hz. It's only in games where a slow display leads to slowly calculated frames that this will cause a problem. That, and arrogant SOBs who claim they can tell the difference without FRAPS.
Plus, at that resolution you ar
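To make the "per-frame" point concrete, here is a minimal fixed-timestep loop sketch in C++ (update() and render() are hypothetical stand-ins for a real game's functions): the simulation keeps advancing at a steady 60 Hz regardless of how slowly the display refreshes, and rendering just interpolates between simulation states.

#include <chrono>

void update(double dt) { /* physics, AI, etc. */ }
void render(double alpha) { /* draw, blending between the last two states by alpha */ }

int main() {
    using clock = std::chrono::steady_clock;
    const double dt = 1.0 / 60.0;            // fixed simulation step
    double accumulator = 0.0;
    auto previous = clock::now();

    for (int frame = 0; frame < 1000; ++frame) {   // stand-in for the real game loop
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        while (accumulator >= dt) {          // let the simulation catch up
            update(dt);
            accumulator -= dt;
        }
        render(accumulator / dt);            // display at whatever rate the monitor allows
    }
}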
Re: (Score:2, Informative)
*sigh* Where do people come up with this garbage? Look at some evidence already instead of making stuff up:
http://mckack.diinoweb.com/files/kimpix-video/ [diinoweb.com]
Re: (Score:1)
Re: (Score:2)
I've checked various framerates. I cannot tell the difference above around 18-20 Hz when paying attention (I've played games at 12-15 FPS without noticing anything wrong), but I recognize others can. In movies/animation/etc., 24 (film), 25 (PAL) and 29.97 (NTSC) FPS are standard.
Re: (Score:2)
Re: (Score:1)
Star Wars Galaxies? Yeah - that's fine at 15-30.
Above 30, I will admit, I don't really notice too much. It's more of a smoothness thing, though. If 30 is as low as it goes, I won't complain too much, but if that's as high as it goes, or even the average - it probably dips much lower... and that IS noticeable.
Re: (Score:3, Funny)
Yeah, but think of the points you could rack up in Scrabble.
Re: (Score:2)
That's 1/4 of WQUXGA's 3840x2400
Black bars (Score:2)
What bothers me more is that the screen uses a 16:10 aspect ratio. Seems Apple is quite fond of 16:10 for some reason (according to that link). I hate 16:10.
In principle, 16:10
Re: (Score:1)
That would be people who have purchased an Nvidia card, who happen to be gamers, who happen to have registered their hardware, who happen to have responded to an e-mail from Nvidia requesting they complete a questionnaire.
That's, what... all of 100 people?
(Ok, so I fall into that category... but I was the one who responded that I pay for one or two games a year... but I play them every day for a long time.)
Re: (Score:2)
(Ok, so I fall into that category... but I was the one who responded that I pay for one or two games a year... but I play them every day for a long time.)
I envy you, as I'm the polar opposite. I rarely finish a game (I'm just now getting closer to finishing the original HL2) but can't stop myself from getting excited about--and subsequently buying--the latest "ooh shiny!"
I suspect it probably points to a fairly fundamental personality trait: I enjoy novelty and learning new systems, but get bored very easily with working my way through levels. I also thoroughly enjoy reading game manuals, again because it's new information. Still, at least my foible
Re: (Score:1)
And own up: how many emulated games have you loaded up only to go 'nice, it works' and then move on to the next one...
Heh. (Score:3, Funny)
Re: (Score:1, Offtopic)
Besides, an 8800 GTX is a very good card. I chose a GTS because of power usage and price. But if you bought a $700 high-end card without actually wanting it, then you are to blame.
Re:Heh. (Score:5, Interesting)
Now NVIDIA is basically advising developers to proceed with caution in DX10 implementations.
Nice.
Re: (Score:2)
Re: (Score:2)
DX10 is Vista-only. You have to look at the market share: there are a lot more XP machines than Vista machines. If you write to DX9, your potential market is, I would guess, about 100 times the size of a Vista-only game's.
Notice that Microsoft's Flight Simulator 10 was written for DX9.
But thanks for buying a bleeding-edge card. In three years, when I pay $200 for my DX10 card, it will probably be faster than your $800 card. Without people like you the rest of us wouldn't get to buy good ca
Re: (Score:2)
Ehhh... that's possibly the most idiotic saying I've ever read on
Stupid pioneers like Thomas Edison (idiot! I pay $.53 for lightbulbs today), Neil Armstrong (duh, like the moon taught us anything!) or even the British/Spanish explorers (Retards! I was born in America, hahaha, idiots risked their lives to sail boats over here).
But, no no, I'm sure your little saying is applicable somewhere... yeah.
Re: (Score:2)
Do you use an Altair PC? Fly on planes made by Curtiss-Wright Aircraft? Use VisiCalc for your spreadsheets?
Armstrong wasn't the pioneer; that would have been Robert Goddard.
Going first always has costs and risks, and more often than not it pays off for the people who go into a land or market second or third.
IBM wasn't the first to produce a computer; they followed Sperry. Apple and IBM were not first with home computers or PCs; they were following Altair and IMS
Re: (Score:2)
Who were the first to cross the Atlantic? The Vikings and the Spanish.
Who was the first to cross the Pacific?
The English did do some exploring, but they were not pioneers for the most part. They came and settled the lands that others had "found".
That was their great achievement. Being the first to walk on some hunk of land is nice for getting your name in the history books but living on it and living well is the real achievement.
The English know about pioneers getting slaughtered
Re: (Score:2)
Re: (Score:2)
Are you running that 8800GTX on Vista or XP?
Because I have to tell you, I'm ALSO an early adopter of the 8800GTX. I run XP (screw Vista) and I couldn't be happier. It was worth every single penny. I haven't had a single problem with it whatsoever.
I run all my games at 1900x1220 resolution at maximum detail levels, and they are all gorgeous. I don't have any performance issues at all.
If you have yourself a $700 paperweight, you've got something else going wrong besides the card i
Re: (Score:1)
Yay Vista!
(duck)
Re: (Score:2)
Leaving aside the cost of the OS, I have nothing against Vista (I use Home Premium) at the moment. I'm using the NVIDIA 158 BETA drivers for my 8800GTS, which are extremely stable with exceptional (compared to the early release drivers) performance. If I had to say anything positive about Vista and gaming, it's that loading times for games and game levels were almost halved after I moved from XP to Vista. Your mileage may vary th
Re: (Score:2)
Re: (Score:2)
Re:Heh. (Score:5, Funny)
Re: (Score:2)
For the first month or so, I dual-booted XP, but since the middle of March I've been running Vista only, and played Vanguard, Company of Heroes, LotRO, Civ4 and a bunch of other stuff with almost no issues. Except for Va
Re: (Score:3, Funny)
You're doing it wrong.
Re: (Score:2)
The day I get a $700 Christmas gift (much less a pre-order) is the day my wife wins the lottery. If I get a pre-order graphics card from her, I know the aliens have truly infiltrated earth and replaced my wife with a brain eating monster. She thinks I play games too much as it is - about 10 hours a week - certainly not the 10 hours in a day I did sometimes in college (I was a binge gamer
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:2)
No, I'm not saying these things are putting out any less heat than the previous model. In fact, based on the power draw requirements (it takes 2 PCI-Express power connectors instead of 1 like most cards), I would guess this thing generates a lot more heat.
What is better about the 8800GTX compared to my prev
Re: (Score:2)
Of course, you'll only find it on the high-end cards, because those are the only cards where they can actually afford a quality cooler. Stock midrange cards use the cheapest coolers manufacturers can find, and you have to pay extra for a good cooler (or passive cooling solution).
Snippets from the article (Score:3, Insightful)
Didn't ATI kick out some DX10 hardware the other day? I'm sure the ATI x29xxx is DX10.
"Our research shows that PC gamers buy five or more games per year, and they're always looking for good games with great content.
Interesting, but it makes me wonder what they put into the definition of "PC gamer."
"Tony and David are right, there are API reductions, massive AA is 'almost free' with DX10. This is why we are able to offer CSAA [up to 16xAA] with new DX10 titles - the same thing with DX9 just isn't practical. Also interesting, but I'm skeptical. Turning on AA is just one API call, how does AA affect overhead?
"So yes we will see big performance jumps in DX10 and Vista as we improve drivers but to keep looking at that area is to really miss the point about DX10. It's not about - and it was never about - running older games at faster frame rates. Wait, rewind. Are he saying my DX7/8/9 games will run faster once Nivida gets their DX10 drivers together? Or is he saying games with DX9 level of graphics will run faster if ported to DX10?
"Five years from now, we want to be able to walk into a forest, set it on fire and for it to then rain (using a decent depth of field effect) and to then show the steam coming off the ashes when the fire is being put out."
No, I can do that in real life. A pyromaniacs-vs.-firefighters burn fest, OTOH....
Re: (Score:2)
Re: (Score:1)
I'm wondering if this has more to do with an architectural change than just a software modification. Maybe DirectX 10 specifications just require the board to have a daughter die similar to what the graphics processor in the 360 has.
Re: (Score:2)
I'm wondering if this has more to do with an architectural change than just a software modification. Maybe DirectX 10 specifications just require the board to have a daughter die similar to what the graphics processor in the 360 has.
Well, according to nVidia [nvidia.com]:
The method of implementing CSAA in DX9 differs from DX10. This is due to a limitation in the DX9 runtime, which prohibits the driver from exposing multisample quality values greater than 7. For this reason, instead of specifying the number of coverage samples with the quality value, we simply set quality to a predetermined value which will be interpreted as a specific CSAA mode by the driver.
So there. It looks like it's just as possible under DX9 but you can't give your devs the warm fuzzy glow of going "set supersampling to 11!"
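For the curious, a minimal sketch of what requesting such a mode might look like on the DX9 app side (the MultiSampleQuality value of 2 is purely illustrative, not NVIDIA's documented mapping; a real app would first probe support with CheckDeviceMultiSampleType):

#include <windows.h>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed               = TRUE;
    pp.SwapEffect             = D3DSWAPEFFECT_DISCARD;    // required for multisampling
    pp.BackBufferFormat       = D3DFMT_UNKNOWN;           // use the desktop format
    pp.EnableAutoDepthStencil = TRUE;
    pp.AutoDepthStencilFormat = D3DFMT_D24S8;
    pp.MultiSampleType        = D3DMULTISAMPLE_4_SAMPLES; // 4 stored color/Z samples
    pp.MultiSampleQuality     = 2;  // hypothetical "magic" value (DX9 caps quality at 7)
                                    // that the driver interprets as a CSAA mode

    IDirect3DDevice9* dev = NULL;
    HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                   GetDesktopWindow(),
                                   D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                   &pp, &dev);
    if (SUCCEEDED(hr)) dev->Release();
    d3d->Release();
    return 0;
}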
Customers are funny... (Score:3, Funny)
Customers are funny, if you ignore them long enough eventually they go away.
Resolution (Score:3, Insightful)
WQUXGA, 3840x2400, or nine million pixels.
Sounds like overkill to me. I mean, I'm used to playing my games at 1280x1024, and I feel this resolution, maybe combined with a wee bit of AA, does the trick.
I'd rather see all that horsepower invested in more frames/sec or cool effects. I know, it's cool to have the capability, but it makes me wonder about what another user posted here regarding the 8800 being a $700 paperweight because of early adoption. You'll have a card capable of a gazillion pixels on a single frame, yet no monitor capable of showing it fully, and when the monitor finally comes out or reaches a good price/value ratio, your card will already be obsolete. Null selling point there for moi.
Just my "par de" cents.
Re:Resolution (Score:5, Insightful)
Re: (Score:1, Interesting)
Re: (Score:3, Insightful)
Re: (Score:2)
Re:Resolution (Score:4, Insightful)
Well, speaking as someone who was living in Austin amongst a bunch of gaming technerds, no one I knew gave one tenth of one shit about 32 bit graphics. In fact, while 3dfx was on top, you could instead get a Permedia-based card which would do 32 bit, and which had far better OpenGL support (as in, it supported more than you needed for Quake) and which was just a hair slower :) I was the only one who had one amongst my friends, and I only got it because I was tired of the problems inherent to the stupid passthrough design.
No, what made the difference was the hardware T&L of the GeForce line. That was THE reason that I and all my friends went with one, and THE reason that nVidia is here today and 3dfx isn't.
No one has yet adequately explained what the hell ATI is still doing here, but it must have something to do with having been the de facto standard for mobile and onboard video since time immemorial (until Intel decided to get a piece of these markets.) Practically every laptop I've owned with 3D acceleration has, sadly, had an ATI chip inside. And usually they do not behave well, to say the least...
Actually, it's simpler (Score:4, Informative)
In a nutshell:
1. 3dfx at one point decided to buy a graphics card manufacturer, just so, you know, they'd make more money by also manufacturing their own cards.
2. They missed a cycle, because whatever software they were using to design their chips had a brain-fart and produced a non-functional chip design. So they spent 6 months rearranging the Voodoo 5 by hand.
The Voodoo 5 wasn't supposed to go head to head with the GeForce 2. It was supposed to, at most, go head to head with the GF256 SDR, not even the DDR flavour. And it would have done well enough there, especially since at the time there was pretty much no software that did T&L anyway.
But a 6 month delay was fatal. For all that time they had nothing better than a Voodoo 3 to compete with the GF256, and, frankly, it was outdated at that time. With or without 32 bit, it was a card that was the same generation as the TNT, so it just couldn't keep up. Worse yet, by the time the Voodoo 5 finally came out, it had to go head to head with the GF2, and it sucked there. It wasn't just the lack of T&L, it could barely keep up in terms of fill rate and lacked some features too. E.g., it couldn't even do trilinear and FSAA at the same time.
Worse yet, see problem #1 I mentioned. The dip in sales meant they suddenly had a shitload of factory space that just sat idle and cost them money. And they had no plan for what to do with that capacity. They had no other cards they could manufacture there. (The TV tuner they tried to make came too late and sold too little to save them.) Basically, while poor sales alone would have just meant less money, this one actually bled them money hand over fist. And that was maybe the most important factor that sunk them.
Add to that mishaps like:
3. The Voodoo 5 screenshot fuck-up. While the final image did look nice and did have 22 bit precision at 16 bit speeds, each of the 4 samples that went into it was a dithered 16 bit mess. There was no final combined image as such, there were 4 component images and the screen refresh circuitry combined them on the fly. And taking a screenshot in any game would get you the first of the 4 component images, so it looked a lot worse than what you'd see on the screen.
Now it probably was a lot less important than #1 and #2 for sinking 3dfx, but it was a piece of bad press they could have done without. While the big review sites did soon figure out "wtf, there's something wrong with these screenshots", the fucked up images were already in the wild. And people who had never seen the original image were using them all over the place as final "proof" that 3dfx sucks and that 22 bit accuracy is a myth.
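To illustrate the mechanism (a toy sketch, not 3dfx's actual hardware path): averaging four dithered 16-bit sample buffers at scan-out yields roughly 7-8 bits per channel, which is where the "22 bit" figure comes from, and grabbing just one buffer, as screenshots did, misses the averaging entirely.

#include <cstddef>
#include <cstdint>
#include <vector>

static void unpack565(uint16_t p, int& r, int& g, int& b) {
    r = (p >> 11) & 0x1F;  // 5 bits red
    g = (p >> 5)  & 0x3F;  // 6 bits green
    b =  p        & 0x1F;  // 5 bits blue
}

// Combine four 16-bit sample buffers into the displayed image by averaging,
// the way the refresh circuitry did on the fly. n is the pixel count.
std::vector<uint32_t> scanOutCombine(const std::vector<uint16_t> buf[4], size_t n) {
    std::vector<uint32_t> out(n);
    for (size_t i = 0; i < n; ++i) {
        int r = 0, g = 0, b = 0;
        for (int s = 0; s < 4; ++s) {
            int rr, gg, bb;
            unpack565(buf[s][i], rr, gg, bb);
            r += rr; g += gg; b += bb;
        }
        // Averaging four 5-bit (or 6-bit) samples gives ~7-8 bits per channel.
        out[i] = (uint32_t)(((r * 255 / (4 * 31)) << 16) |
                            ((g * 255 / (4 * 63)) << 8)  |
                             (b * 255 / (4 * 31)));
    }
    return out;
}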
Re: (Score:2)
I remember all that you speak of.
It wasn't a horrible idea, but going exclusive was.
Re: (Score:2)
Whoa whoa whoa... you should not be comparing 16-bit vs. 32-bit colour to 'high resolutions'. You could easily see the quantization errors with transparency effects like smoke or skies on 3dfx cards, you could easily tell the difference between 16-bit a
Re: (Score:2)
"Our cards are designed for playing bad and ugly games!" is not a good sales pitch...
Re: (Score:2)
Along the same lines as what got ATI in the running in the graphics market, pre-9500. (They were "in" the market, yeah, but IMO they did not matter until the 9500.)
All of the graphics being put on screen were being drawn even if you could not see them: say, a house with a fence in the back yard, grass, a lawn chair and other stuff that you can't see because you are standing in front of it.
At the time that was a lot of CPU/GPU power being wasted, until the question "why" was asked
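A toy software z-buffer sketch of the idea (entirely illustrative; ATI's actual answer was hierarchical-Z hardware): with a depth test, fragments hidden behind nearer geometry are rejected before any shading work is spent on them, provided nearer geometry lands in the buffer first.

#include <cstdio>
#include <vector>

struct Fragment { int x, y; float z; unsigned color; };

void rasterize(const std::vector<Fragment>& frags, int w, int h) {
    std::vector<float>    zbuf(w * h, 1.0f);   // initialized to the far plane
    std::vector<unsigned> fb(w * h, 0);
    int shaded = 0, rejected = 0;

    for (const Fragment& f : frags) {
        float& z = zbuf[f.y * w + f.x];
        if (f.z < z) {                    // closer than what's already there
            z = f.z;
            fb[f.y * w + f.x] = f.color;  // the expensive "shading" happens here
            ++shaded;
        } else {
            ++rejected;                   // hidden fragment: no shading cost
        }
    }
    std::printf("shaded %d, rejected %d\n", shaded, rejected);
}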
22 bits (Score:1)
Re: (Score:1, Interesting)
Re: (Score:3, Funny)
WQUXGA, 3840x2400, or nine million pixels.
How about five-letter acronyms being enough for anyone?
Re: (Score:3, Interesting)
A display of the future, approaching the human eye's capabilities:
A 60"-80" diameter hemisphere; it will probably be oval-shaped, since our field of vision is.
2 GIGApixels (equal to about a 45000 x 45000 pixel image, 1000x
Re:Resolution (Score:4, Informative)
You say this like it means something. It does not. Here's why.
The real world is based on objects of infinite resolution. Our vision is limited by two things: the quality of the lens and other structures in front of the retina, and our brain's ability to assemble the incoming data into something useful that we perceive visually.
A lot of people make the mistake of believing that the finest detail we can resolve is in some way limited by the sizes or quantities of features on the retina. This is a bunch of bullshit. Here's why: Saccades [wikipedia.org]. Your brain will use your eye muscles without your knowledge or consent to move your eye around very rapidly in order to make up for deficiencies in the eye surface and to otherwise gather additional visual data.
Have you ever seen a demo of the high-res cellphone scanning technique? There's software (or so I hear, I saw a video once and that's all) that will let you wave your cameraphone back and forth over a document. It takes multiple images, correlates and interpolates, and spits out a higher-resolution image. (No, I don't know why we haven't seen this technology become widespread, but I suspect it has something to do with processor time and battery life.) Your eye does precisely the same thing! This leads us to the other reason that your statement is disconnected from reality; what you think you are seeing is not, repeat not a perfect image of what is before you. Your eyes are actually not sufficiently advanced to provide you so much detail if that is what it was!
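A minimal sketch of the shift-and-average idea behind that technique (sub-pixel frame offsets are assumed known here; estimating them, i.e. registration, is the hard part in real systems):

#include <cstddef>
#include <vector>

struct Frame {
    std::vector<float> pixels;  // w*h grayscale samples
    float dx, dy;               // known sub-pixel offset, in low-res pixels
};

// Accumulate several offset low-res frames onto a 2x-denser grid and average.
std::vector<float> superResolve(const std::vector<Frame>& frames, int w, int h) {
    const int W = 2 * w, H = 2 * h;
    std::vector<float> sum(W * H, 0.0f), count(W * H, 0.0f);

    for (const Frame& f : frames) {
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                // Map each low-res sample to its nearest high-res cell.
                int X = (int)((x + f.dx) * 2.0f + 0.5f);
                int Y = (int)((y + f.dy) * 2.0f + 0.5f);
                if (X >= 0 && X < W && Y >= 0 && Y < H) {
                    sum[Y * W + X]   += f.pixels[y * w + x];
                    count[Y * W + X] += 1.0f;
                }
            }
        }
    }
    for (std::size_t i = 0; i < sum.size(); ++i)
        if (count[i] > 0.0f) sum[i] /= count[i];  // average overlapping samples
    return sum;  // cells no frame landed on stay zero (holes)
}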
No, what you think you are seeing is actually an internal representation of what is around you, built out of visual data (from the optic nerve, which performs substantial preprocessing of the retinal information) and from memories. Your brain fills in that part of the "image" for which it does not have good information from your own mind. This is why you so commonly think that something looks like something else at first glance - your brain made an error. It does the best it can, but it only has so much time to pick something and stuff it in the hole.
Stop trying to equate vision to a certain number of pixels. It's different for everyone, and it's only partially based on your hardware. Your brain does vastly more processing than you are apparently aware. Some people get to see things that aren't there all the time! (Or maybe it's the rest of us whose visual system has problems? Take that thought to bed with you tonight.)
Re: (Score:2)
Of course it is. If it's not limited by the cones or rods, it's limited by the scanning rate, the optic nerve rate, or the rate at which saccades happen. IT IS LIMITED; it's just that we don't know the current limits technically. Even if we don't know how to calculate them through biological measurements, they are very easy to measure through subjec
Re: (Score:2)
WQUXGA, 3840x2400, or nine million pixels.
Sounds like overkill to me. I mean, I'm used to playing my games at 1280x1024, and I feel this resolution, maybe combined with a wee bit of AA, does the trick.
I used to feel this way, running at 1280x1024 on a pretty decent 19" CRT. However, about a year ago I finally upgraded to a 22" widescreen LCD with a native resolution of 1680x1050, and the difference it made was astonishing. Games that supported high resolutions (Company of Heroes, WoW, Oblivion, etc.) felt incredibly more open. By contrast, I recently reloaded Enemy Territory on my system, which I have to run in a 1280x1024 window because the full-screen standard resolutions look like crap on my wide-s
Speaking of Nvidia Development (Score:3, Interesting)
The thing that pisses me off is that Nvidia seems to have done this for absolutely no reason at all, and Windows 2000 is still a fine operating system for me. I have no reason at all to switch to Windows XP (and hell no to Vista), and I especially don't care for the activation headaches (I like to switch hardware around from time to time to play with new stuff and go back once I've gotten bored with it if I don't need it, such as borrowing a friend's dual-P4 motherboard).
Anyway, my point/question: why must Nvidia force customers who use their hardware for developing games onto later Windows operating systems like that? Anybody got any tips on how to 'lie' or disable the Windows version check to force, say, FX Composer 2 to install on Windows 2000? It isn't like we're talking about Windows 98 here; Win2k is a fine OS and, in my opinion, actually the best one Microsoft has ever done.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Anybody got any tips on how to 'lie' or disable the windows version check to force say FX Composer 2 to install on Windows 2000?
I too would like to know if there's a way to do this.
Re: (Score:1)
Windows 2000 was probably the most stable of the consumer OSes I've seen Microsoft roll out. XP, sure, it has a firewall and all, but the only thing I like about XP over 2000 -- the ONLY THING -- is the integrated browsing of *.zip files. That's it. The install is four times as big and just as stable. I really never saw the need to buy XP, ever. Work environments have been my main source of exposure to it.
Linux? (Score:2, Interesting)
Open GL (Score:2)
The state of the Open GL (Score:4, Informative)
So why... (Score:2)
So why are games being written for Direct3D? Why would a developer voluntarily chain himself to a single vendor, any vendor, let alone Microsoft?
What would they be giving up by writing to OpenGL? It runs on Windows, right?
Re: (Score:2)
Re: (Score:2)
Wow (Score:2)
This is about right: when the Xbox came out, it was about on par with PCs at the time. Six months to a year down the track, the top-of-the-line PCs were way ahead. Now, the 360 and PS3 (which isn't living up to the hype; most of the graphics on the 360 and PS3 are about the same despite the 360 being a year older) aren't competing with the t
Re: (Score:1)
Re: (Score:2)
The industry must move fo