Crysis 2 Update a Perfect Case of Wasted Polygons
crookedvulture writes "Crytek made news earlier this summer by releasing a big DirectX 11 update for the PC version of its latest game, Crysis 2. Among other things, the update added extensive tessellation to render in-game elements with a much higher number of polygons. Unfortunately, it looks like most of those extra polygons have been wasted on flat objects that don't require more detail or on invisible layers of water that are rendered even in scenes made up entirely of dry land. Screenshots showing the tessellated polygon meshes for various items make the issue pretty obvious, and developer tools confirm graphics cards are wasting substantial resources rendering these useless or unseen polygons. Interestingly, Nvidia had a hand in getting the DirectX 11 update rolled out, and its GeForce graphics cards just happen to perform better with heavy tessellation than AMD's competing Radeons."
Hmmmm. (Score:5, Insightful)
So you're saying that a graphics card company just *might* have tried to get a widely-used benchmark written in its favor.
Not that it's ever happened before... *coughintelnvidiacough*...
Re:Hmmmm. (Score:4, Informative)
It's worth noting that most benchmarks use a specific version of popular games. If the next version breaks benchmark functionality in a significant way, testers simply continue using the old version.
Then again, has Crysis 2 ever been used as a serious benchmark? The game actually looked worse than Crysis (especially Warhead) in terms of graphics in spite of having higher polygon counts and such, and was designed from the ground up to work on machines that would never be able to run the original Crysis or Warhead (current-gen consoles).
Re: (Score:2, Interesting)
I'm not convinced. I'll have to ask my friends in DX development to give me the final nod one way or the other, but I know this author is clueless about the subject.
There are a lot of times in computer graphics where something is seemingly wasteful--but is the most efficient solution.
For example, the claim that "This is the most detailed parking barrier in cinema or game history" is untrue. Pixar's RenderMan is, at least for now, still probably the most popular renderer in VFX. For every pixel it
Re:Hmmmm. (Score:5, Insightful)
I work in games. You sir, are an idiot. Are you seriously comparing a game engine to RenderMan? We have to render a game's frame in 16ms, you have to render a frame in something less than a minute. I read the entire article. This was clearly a patch meant to appease the PC gamers into thinking that it wasn't a shoddy console port.
Re: (Score:2)
They're just being dumb - or favouring NVidia. The tessellation support is designed to make it pretty much trivial to adapt tessellation levels based on distance (a minimal sketch of that falloff follows below). While NVidia cards can cope with ludicrous levels of tessellation and polygons, ATI cards can't - and the penalty NVidia users pay for getting this support is that their hardware offers worse price/performance on everything else, which is why NVidia are so keen for all games to use this.
(There have been similarly fishy things before. For example, som
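As a rough illustration of the distance-based falloff mentioned above, here is a minimal C++ sketch of computing a tessellation factor that decays with distance from the camera. Every name and tuning constant in it is hypothetical - this is the general LOD idea that DX11 hull shaders make easy, not anything taken from CryEngine or an NVidia SDK:

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    static float Distance(const Vec3& a, const Vec3& b) {
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    // How finely to subdivide a patch: full detail inside nearDist, no extra
    // detail beyond farDist, linear falloff in between. All constants are
    // illustrative tuning values.
    float TessFactor(const Vec3& patchCenter, const Vec3& cameraPos,
                     float nearDist = 5.0f, float farDist = 100.0f,
                     float maxFactor = 64.0f) {
        float d = Distance(patchCenter, cameraPos);
        float t = (d - nearDist) / (farDist - nearDist);
        t = std::min(std::max(t, 0.0f), 1.0f);          // clamp to [0, 1]
        return std::max(1.0f, maxFactor * (1.0f - t));  // 1.0f = leave the patch alone
    }

With something like this in place, a distant (or flat) patch costs almost nothing extra, which is exactly what the screenshots suggest Crysis 2 is not doing.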
Re: (Score:2)
Using a frustum-exclusion algorithm to selectively render only certain parts of the scene graph is really common. It's trivial to eliminate things from the render pipeline based on visibility.
Re: (Score:2)
Frustum culling removes parts of the scene outside your field of view; it does not remove parts based on visibility - for that you have to go into occlusion queries, which are a whole lot more complicated. However, I seem to remember seeing screenshots of occlusion queries being used in Crysis 1 to cut down on the amount of water being rendered, so it looks a bit weird seeing them waste so many polygons here on water that is neither visible nor even needed, given that it would be invisible the whole time.
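For anyone unsure of the distinction being drawn here, this is a minimal C++ sketch of frustum culling (all names illustrative): test an object's bounding box against the six view-frustum planes and skip it if it lies entirely outside any one of them. Occlusion queries - deciding whether something inside the frustum is hidden behind other geometry - are a separate, GPU-assisted mechanism and are not shown:

    struct Plane { float nx, ny, nz, d; };  // inside if nx*x + ny*y + nz*z + d >= 0
    struct AABB  { float minX, minY, minZ, maxX, maxY, maxZ; };

    // True if the box lies entirely on the outside of the plane, tested via
    // the box corner farthest along the plane normal (the "positive vertex").
    static bool OutsidePlane(const AABB& b, const Plane& p) {
        float px = (p.nx >= 0.0f) ? b.maxX : b.minX;
        float py = (p.ny >= 0.0f) ? b.maxY : b.minY;
        float pz = (p.nz >= 0.0f) ? b.maxZ : b.minZ;
        return p.nx * px + p.ny * py + p.nz * pz + p.d < 0.0f;
    }

    // Frustum culling: skip the object if its bounds fall outside any plane.
    // Passing this test does NOT mean the object is visible; it may still be
    // hidden behind something - that is what occlusion queries are for.
    bool MaybeVisible(const AABB& bounds, const Plane frustum[6]) {
        for (int i = 0; i < 6; ++i)
            if (OutsidePlane(bounds, frustum[i]))
                return false;
        return true;
    }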
Re: (Score:2)
What isn't so obvious is why CryTek would go for it.
Because the game had already been out for several months and had long since peaked in sales? Because the money/support/whatever that Nvidia offered way outweighed any potential profits from further PC sales?
Not surprised (Score:5, Interesting)
One thing I learned from writing video drivers is that game developers are probably the very last people who should be developing graphics engines. We were constantly amazed by the insanely performance-sucking tricks they used to play which we then had to detect and work around; often their poorly-designed method of rendering something would be 10-100x slower than a sensible implementation.
Valve and id are the most obvious exceptions; I don't think we ever found them doing anything really retarded unlike certain big name developers I could mention.
Re: (Score:1)
Not that I don't believe you... I have (many) reasons to... but a technical example of bad optimization would be nice. It's boring reading endless amounts of non-substantiated claims on Slashdot.
Re: (Score:1, Offtopic)
Tsk tsk tsk!
The internet isn't as "safe" as all of you gossipers want us to believe. That thing there in the GP is a username, '0123456', and he just gave out two company names, 'id' and 'valve'. If you know the USA or have read any lawsuit nightmares here, then you know that giving out any more triangulating info gets too specific.
Anyone who worked with him on that section of code will notice today, or on a google search a month from now, and is bound to be drawing eyeballs to quote us here for other f
Re: (Score:1)
If the GP wants anonymity then they should post as AC or STFU. Their +5 Interesting comment is completely devoid of any valuable information. Apparently some developers wrote inefficient code at some point in time. Whoa, you're blowing my mind.
Valve just does bad hacks elsewhere (Score:1)
The one I've noticed the most is sound stuttering. Half-Life 2 had some bad sound stuttering issues back when it came out. Valve swore up and down it was a sound card issue, not their engine. Well, it still does it today on a completely different (and stupidly powerful) system, as does Team Fortress 2. It isn't horrible, but it is noticeable, and there's no excuse given that other games don't do it and my system is extremely overpowered compared to what the games need.
Re: (Score:1)
valve has always blown ass in sound. I mean, WTF was that crap in half life one? it sounded like someone got drunk and made a proof of concept for the pc speaker, 2 decades late, on a sound blaster live
Re: (Score:1)
Oh, yEAh?
Re: (Score:1)
One thing I learned from writing video games is that driver developers often don't know much about real-world performance. ;-) Much of the performance advice we have seen given by GPU teams in the past had zero benefit to game performance and took weeks of developer time to implement and maintain. On the other hand, sometimes you come across a real gem.
Short version: good programmers good, bad programmers bad. Sometimes what is good for one case is not good for another case.
Re: (Score:2)
Much of the performance advice we have seen given by GPU teams in the past had zero benefit to game performance and took weeks of developer time to implement and maintain.
One thing worth noting is that a change that makes no difference on the card you're testing with may make the difference between the game being playable or a slideshow on a different card.
One particularly amusing issue I remember was with a new feature in Direct3D where I believe we were the only people who supported it in hardware at that time and everyone else emulated it in software; we got a new game from big name game developer X and it ran vastly slower on our card than on much less powerful systems.
Re: (Score:1)
One thing worth noting is that a change that makes no difference on the card you're testing with may make the difference between the game being playable or a slideshow on a different card.
Absolutely true. My anecdotes above were in regards to very specific hardware so this comment doesn't really change what I'm saying, but it's an important thing to understand in a general sense.
One particularly amusing issue I remember was with a new feature in Direct3D where I believe we were the only people who supported it in hardware at that time and everyone else emulated it in software; we got a new game from big name game developer X and it ran vastly slower on our card than on much less powerful systems. The idea was that you'd enable this feature once and then keep using it, but they were turning it on and off hundreds of times per frame, and each time that caused a major pipeline stall in our hardware. So once we figured that out we just detected the game and dropped back to software emulation like everyone else, but if they'd known what they were doing the game would have worked fine on all cards and been faster on ours, because they'd actually have been using the hardware instead of the CPU.
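The standard application-side defence against exactly this failure mode is to cache render state and only forward genuine transitions to the API. A minimal C++ sketch, with a hypothetical state enum and a stand-in for whatever real call (e.g. a Direct3D SetRenderState) actually sets the state:

    #include <functional>
    #include <unordered_map>

    // Hypothetical state IDs; a real renderer would use the API's own enums.
    enum class RenderState { AlphaBlend, DepthTest, StencilTest };

    class StateCache {
    public:
        // 'apply' stands in for the real driver call; it only fires when the
        // cached value actually changes.
        void Set(RenderState state, bool enabled,
                 const std::function<void(RenderState, bool)>& apply) {
            auto it = cache_.find(state);
            if (it != cache_.end() && it->second == enabled)
                return;               // redundant toggle: never reaches the driver
            cache_[state] = enabled;  // genuine transition: record and pass through
            apply(state, enabled);
        }
    private:
        std::unordered_map<RenderState, bool> cache_;
    };

With a wrapper like this, "enabling and disabling something hundreds of times per frame" collapses to the handful of calls that actually change anything.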
To be fair, you're accusing the dev in question of not optimising for your card when you admit that the card in question was unusual and probably released after the game was developed - otherwise you presumably would have worked with them to improve their software? It's all well and good to say "they should
Re: (Score:1)
To be fair, you're accusing the dev in question of not optimising for your card when you admit that the card in question was unusual and probably released after the game in question was developed- otherwise you probably would have worked with them to improve their software?
On the other hand, it's pretty well known that issuing unnecessary state changes to 3D APIs is bad and can be costly. So even if they didn't know the extent of the problems it caused on that particular card, enabling and disabling something for no good reason a hundred times per frame is bad.
Re: (Score:1)
On the other hand, it's pretty well known that issuing unnecessary state changes to 3D APIs is bad and can be costly. So even if they didn't know the extent of the problems it caused on that particular card, enabling and disabling something for no good reason a hundred times per frame is bad.
Agreed - however, in the (distant) past we've had to do exactly this because of bugs in the driver's state caching. I've also seen Cg hitting state changes fairly hard on some platforms - there was an optimisation to prevent this, but it used to cause memory leaks. It can be difficult to know exactly what's going on under the hood there, and you can't really blame the application developers for this without knowing the specific circumstances.
Re: (Score:2)
Perhaps, though, the reason is what is apparent from this article. It seems the developers had a (cash, obviously) incentive to make one manufacturer's cards look better. While they could optimize for that manufacturer, most sensible optimizations would probably benefit the other manufacturer too, and finding optimizations that work much better on your preferred manufacturer would be too hard to do.
So, what if you know there is a particular function that is very slow on the manufacturer you want to show in
Re:Not surprised (Score:4, Insightful)
Valve and id are the most obvious exceptions
Try saying that to John Carmack.
I think that was the point. Mr. Carmack works for Idthesda, and Valve's Source engine is based on GoldSrc, which in turn was forked from the Quake engine (early id Tech 2) written by Mr. Carmack.
Re: (Score:2)
Well, I've got to say that Rage (on which Carmack spent the last six years or so implementing a "megatexture" hack that was worth maybe a couple of months) looks like crap compared to Crysis. Everything looks smeary (marketed as "painterly" and "atmospheric").
And while Crysis may waste polygons, Rage doles them out like a miser - the main character's head has visible lumps - it's actually even pointed. His big, round shoulder pads get about half a dozen polygons each - you can see the corners and seams of t
Re: (Score:1)
Anyway, the new Doom was the same thing. Everyone raved about how great it looked, but to me it looked like a bunch of plastic dolls. The textures might be highly detailed, but when they look like waxy plastic models it's worse than previous attempts.
Re: (Score:3)
Only to lose business to a competing product which does optimise its drivers? What kind of arse-backwards logic is that?
Re: (Score:2)
If you weren't doing things ass-backwards and developing workarounds in drivers for individual games then they would be forced to do things properly wouldn't they?
That was always my argument, but then people would stop buying our cards and buy cards where the game ran 'properly'.
Re: (Score:2)
Ah yes, Dark Age of Camelot. Good times!
Never attribute to malice...? ha! (Score:2, Interesting)
Never attribute to malice that which is adequately explained by stupidity
It's entirely possible that the tessellation is per-node. E.g. in the case of the barrier, only the top seems to benefit, in that the handles jut out (why those handles aren't polygons to begin with is another question, given that it would take only 8 or so for each... hardly making a dent in polygon budgets), but it's the entire thing that gets tessellation applied. Similarly, unseen parts get tessellated (why there is water underneat
Exactly. Perhaps a better phrase... (Score:2)
A title with the triple-A budget of Crysis 2 wouldn't have developers who never once bothered to view the map in wireframe at some point before release.
Take a look at all the TWIMTBP/Nvidia logos slapped all over the game and you know who is paying the bills.
Re: (Score:2)
Maybe Crytek was in a hurry to rush out these features and the team was confident it could reach acceptable framerates in a hurry.
I saw a presentation of a lot of the features the Crysis 2 DX11 patch adds to the game, in action in the engine, over a year ago. But I get the impression that the Crysis 2 art assets were not initially created to support them, so they took some assets after the fact and touched them up.
Maybe the water was so low on polys before they added tessellation that they didn't bother to cull i
3D ready (Score:5, Interesting)
Re: (Score:3)
Damn right, in a 4D display you could see them, even if they're underground!
Re: (Score:2)
In a 4D display the game would have to render the objects that will be there at some future point, too.
Re: (Score:1)
I know you're joking, but the current state of holographic projection is a convex mirror; you can in fact replicate it with a chrome popcorn bowl and a flashlight.
What a minute (Score:3, Funny)
Re: (Score:2)
I got it in a bargain for $15, played the single-player, and it wasn't that bad. Didn't play the prequel, but this one played pretty well. For $15 I would say it was a fair price; if it had cost more, I would have passed on it, just as I did at its launch. It's too bad, though, that it ended so quickly - I wanted more...
Re: (Score:2)
Wait a minute. Not what.
Not surprised (Score:2)
Crysis's claim to fame was that it gave the GPU a real workout, and it did. They ended up rendering a whole world of extra detail to make a realistic-looking environment. Along comes Crysis 2 and frankly I am not at all impressed. On a computer that has no trouble handling any other game, I had to drop the quality settings to ultra-ugly to make the game playable. I'd prefer less pretty garbage on the screen than having to play a game at a resolution where the pixels are the size of a man's fist.
It just seem
Re: (Score:2)
Crysis 2 does not look bad even without this DX11 patch, but the gameplay does suck.
However, Crysis 2 does kill the claim from people that they want a game with gameplay, not graphics.
Re: (Score:2)
Crysis 2 didn't look any better than Crysis 1 to me. If anything, it may be a step backwards. And Crysis 1 was unquestionably the better game - it had a better storyline, more varied and bigger levels (Crysis 1 was an on-rails shooter but the levels were so wide open that you didn't notice it - Crysis 2 was an on-rails shooter and that fact was in your face the entire time), and actual vehicle combat which was almost done away with in Crysis 2. The only improvements in Crysis 2 were the controls and maybe t
Re: (Score:2)
Oh and about performance, if anything Crysis 2 is less demanding than Crysis 1. In Crysis 1 with the settings maxed (except AA, I run 8xAA because 16x crashes for some reason) it runs close to 60fps but drops down to 40-something in some scenes. In Crysis 2 with the settings maxed I was getting a solid 60fps+ with no slowdown. This is with twin GTX260 Superclocked cards, an i7 940 and 12GB DDR3.
As someone who works at AMD (Score:1)
and whose views don't represent those of the company in any official capacity: this pisses me off.
I don't believe for a second it was an accident. This is bare knuckles marketing pure and simple and I'm glad it's getting some attention.
buzz off bozo (Score:1)
Whiny (Score:1)
Re: (Score:1)
What do you mean? Gamers have acted entitled for years. They whine and cry when they don't get every last thing they want in the way they want it, and for free to boot, and that's been the case since the 90s. The only way it could get worse is if they decided they deserve to be paid to deign to play the games.
Re: (Score:1)
console gamers whine and cry; pc gamers vote with their money - if the game is good, they spend it. consolers buy whatever crap fad company X pushes and whine when Kinect is not nanosecond-perfect. pc gamers want bigger, better graphics? they buy a fucking video card. consolers, on the other hand, whine for half a decade about not having AA at 720p and then buy another 6 games in 4 months.
or, in other words, you have your story backwards
Re: (Score:2)
Unintentional self-parody.
Re: (Score:2)
console gamers whine and cry, pc gamers vote with their money
Tell me, what color is the sky in your world? It's blue in mine.
Re: (Score:2)
So I don't argue about playing them.
hey fucktard (Score:1)
if you do not feel the same when someone attempts to deceive and defraud you, you are a moron of the first order and i have a bridge to sell you.
Don't act like a moron. (Score:1)
the problem here is, nvidia used some programming gimmicks to make their cards perform better by creating extra load from polygons that are rendered UNDER WATER and UNDER LAND, and therefore INVISIBLE.
no benefit to gamers here. no benefit to anyone. NO ONE WILL BE ABLE TO SEE WHAT IS BEING RENDERED.
however, this will create extra unnecessary load in a way that some nvidia chips can handle better, and show compe
Re: (Score:2)
in case you forgot - ati is not a CARD manufacturer. it is a chip manufacturer. video cards are just like motherboards - you have to buy a proper quality card.
sapphire, for example. or asus. you can't go wrong with these. while other brands may have problems, you can overclock sapphires to oblivion, which is what i did in age of conan, for example.
Re: (Score:1)
so it's nvidia's fault that a cheap one-hit-wonder tech demo company does not know how to make a decent game
right
Re: (Score:2)
Sure as hell wasn't FarCry 2; the gameplay in that was just awful. I couldn't push myself to play more than about 5 hours. I just had to clear out the same outpost of militiamen one too many times.
Re: (Score:2)
FarCry 2 was not created by Crytek. Ubisoft owns the FarCry franchise, but lost Crytek to EA after FarCry was published. FarCry 2 and the upcoming FarCry 3 are made by completely different studios.
Not really a big deal (Score:1)
As best I can tell, this essentially boils down to retrofitting DirectX 11 onto an already designed engine after the fact, and doing so in a limited time frame.
I don't really see it as a big deal: first, the game was originally designed for DirectX 9 hardware, so anybody trying to run the game on DirectX 11 hardware will probably do just fine anyway; and second, the way modern graphics cards are designed, this extra geometry generated on the GPU may not even be the bottleneck.
I think that engines that will really ta
Well duh. (Score:2)
tessellation of flat surfaces (Score:3)
makes the shading on them look different, so it's not all wasted vertices (well, depending on how they calculate the shading). but you can easily test that in some modeller: make a cube that has each side made of two triangles and observe how it's shaded with basic OpenGL shading - now turn on some tessellation (while keeping the shape as it is) and you can see the difference; you can see highlights on flat surfaces even without applying some fake Phong technique.
this or any graphics upgrade doesn't help with Crysis lacking in complexity due to launching on consoles, though, so who cares - the memory and CPU available for the game logic were dictated by that.
Re: (Score:3)
makes the shading on them look different, so it's not all wasted vertices (well, depending on how they calculate the shading).
Uh, nope. Tessellation changes the vertices, while shading is done in the fragment/pixel shader. Those are different stages in the pipeline. The graphics card automatically does linear interpolation of positions, normals, UVs, etc. between the vertices, which is what you want in all cases that I'm aware of.
Your modeller probably has the most basic material applied (for performance), maybe even Gouraud shading [wikipedia.org], so that's not really a good reference. CryEngine3 uses highly advanced shaders, where things like
Re: (Score:2)
*(well, depending on how they calculate the shading)*
and, well, you wouldn't want it always; the artist should choose the normals on the model for the desired look. if the normals point outwards from the surface there won't be a hard edge at the corners, but slap on some tessellation and the normals averaged from face normals no longer make the shading (not shadows) look so wrong at corners. so what i'm getting at is that tessellation can change the normals you're interpolating between. if tha
Re: (Score:2)
Sure, you can do shading like that if you're only using flat shading - a very ancient technique that was made virtually obsolete when pixel shaders hit the mainstream. A proper shading algorithm does not need distinct polys to apply gradients or shadows; it simply calculates the proper lighting value for each pixel. That's why today's GPUs have hundreds of those tiny processors. To do the same via polygons would require prohibitive amounts of memory and just as much processing, since you would wind up w
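To make the per-pixel argument concrete, here is a minimal CPU-side C++ sketch of Lambert diffuse plus Blinn-Phong specular lighting with an interpolated normal. Real engines run this in a pixel shader, and all names here are illustrative, but the point stands: because the normal is interpolated across the triangle, a highlight can appear in the middle of a big flat polygon with no extra vertices:

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    static Vec3 Normalize(Vec3 v) {
        float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        return { v.x / len, v.y / len, v.z / len };
    }
    static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Lighting intensity for a single pixel. 'normal' is interpolated across
    // the triangle by the rasterizer, so smooth shading and highlights do not
    // depend on how finely the surface is tessellated.
    float ShadePixel(Vec3 normal, Vec3 toLight, Vec3 toEye, float shininess) {
        Vec3 n = Normalize(normal), l = Normalize(toLight), v = Normalize(toEye);
        float diffuse = std::max(0.0f, Dot(n, l));
        Vec3 h = Normalize({ l.x + v.x, l.y + v.y, l.z + v.z });  // half-vector
        float specular = std::pow(std::max(0.0f, Dot(n, h)), shininess);
        return diffuse + specular;
    }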
Too many polygons, so what? (Score:2)
Games use too many polygons, so what? They also use too much RAM, too much disk space, and too much processing power in general.
The important thing in video games is making them work, not making them optimal.
Why is tessellation done everywhere, even on relatively flat stuff? Because the development team did not waste time studying each object one by one; the tessellation was applied programmatically to everything.
read first, moron. (Score:1)
basically it's a hidden object that favors some nvidia chips and makes the competitors' cards choke.
it's fraud.
Re: read first, moron. (Score:2)
I did read TFA.
It's a missed optimization. Are optimizations compulsory now?
If ATI cards can't deal with a higher computation load, it's just because they're not as good, that's all there is to it.
Re: (Score:2)
Oops, wrong place in thread. Can someone remind me why slashdot still doesn't allow editing or deleting posts?
EA can't do anything right (Score:2)
It took the game's consolified low-poly meshes and programmatically inflated the poly count, barely enhancing the image quality at all while driving processing requirements way up. It looks more like one of those "repeat N 1000 times" benchmarks, the kind I write when trying to find bottlenecks in a web page, than any sort of effort to make Crysis 2 not look like the steaming EA-published turd it is.
Crysis 1 still looks better than this half-assed sequel, DX11 be damned.
If this is some bizarre partnership
Slashdotted ? In 2011 ? (Score:2)
Techreport seems to be slashdotted. Am I being too harsh, or is that horribly embarrassing for a site that focuses on performance testing and overclocking?
Just sayin...
Re: (Score:2)
*This picture brought to you by Cuil*
Comment removed (Score:5, Insightful)
Re: (Score:2)
... for no fucking reason ...
... see the fricking water!
I find your inconsistent application of profanity appalling...
Re: (Score:2)
Actually, I thought it showed a remarkable gradation of emotion. Of course, I had a great-uncle who could curse for 2 1/2 minutes straight without repeating himself. It gave me an appreciation for the art. ;-)
Re: (Score:2)
The only time I heard him do it, I was probably 6 or 7, just before he died. Two languages that I'm sure of; Canadian French (from his mother) and American English. Since he grew up in a logging town in Northern Minnesota in the early 1900s, my guess is he must've picked up some Norwegian, Swedish, Italian, and who knows what all else. I sure didn't. :)
Re: (Score:2)
Have you EVER said to yourself 'Boy this game would have totally had me if it had only rendered the concrete dividers in such loving detail I can make out the scuff marks from the boot of the guy who last leaned on it'?
It certainly would make "concrete divider cleaner 2000" more realistic.
Re: (Score:2)
You have a highly intensive programming trick used for no fucking reason on completely stupid shit like concrete dividers.
Seems especially pointless as in the screenshots the divider is right next to some poorly rendered leaves that look flat and unrealistic.
Re: (Score:1)
but maybe you like getting defrauded. that's your preference and i respect it.
Re: (Score:2)
ATI is the one trying to trick you.
Their cards have less computing power, but they make up for it with nifty tricks that only work in certain cases. As soon as you get out of these idealized cases, performance drops dramatically.
Re: (Score:2)
ati is giving people what they need - medium power, low energy consumption, efficient chips, and you can use as many as you want. the machine i am on has 2 x 5670 cards as of this moment. they are cheap mid-range cards, despite being decent. however, when crossfired, they perform like a 5770 card,
Re: (Score:2)
Won't even bother reading. My opinion is clearly the better one.
hehe.
Re: (Score:2)
Come on!