AMD, NVIDIA, and Developers Weigh In On GameWorks Controversy
Dputiger writes: "Since NVIDIA debuted its GameWorks libraries, there have been allegations that they unfairly disadvantaged AMD users or prevented developers from optimizing code. We've taken these questions to developers themselves and asked them to weigh in on how games get optimized, why NVIDIA built this program, and whether it's an attempt to harm AMD customers. 'The first thing to understand about [developer/GPU manufacturer] relations is that the process of game optimization is nuanced and complex. The reason AMD and NVIDIA are taking different positions on this topic isn't because one of them is lying; it's because AMD genuinely tends to focus more on helping developers optimize their own engines, while NVIDIA puts more effort into performing tasks in-driver. This is a difference of degree — AMD absolutely can perform its own driver-side optimization, and NVIDIA's Tony Tamasi acknowledged on the phone that there are some bugs that can only be fixed by looking at the source. ... Some of this difference in approach is cultural but much of it is driven by necessity. In 2012 (the last year before AMD's graphics revenue was rolled into the console business), AMD made about $1.4 billion off the Radeon division. For the same period, NVIDIA made more than $4.2 billion. Some of that was Tegra-related and it's a testament to AMD's hardware engineering that it competes effectively with Nvidia with a much smaller revenue share, but it also means that Team Green has far more money to spend on optimizing every aspect of the driver stack.'"
Re: (Score:1)
Yes, but not in the way you mean it. AMD execs would likely kill to get their hands on information about how nvidia optimizes its drivers. It's a well known reality that ATI had severe problems with driver quality from early stages, and these problems persist today long after AMD bought the company.
So yes, it makes perfect sense not to give the software part of the technology stack away to a competitor that has emphasized hardware at the expense of driver quality. Hardware-wise, AMD currently has better bang for a euro than
Re: (Score:3)
So it's absolutely understandable that nvidia chooses not to open the driver code.
...try telling that to Stallman et al!
Seriously though, maybe it makes me a Bad Linux User, but I'm absolutely OK with the state of nVidia drivers: installation is a piece of cake, and 2D and 3D performance is great (I think 3D performance is on par with Windows [OpenGL, obviously]).
I don't have any experience with new ATI cards under Linux, but I've had hit-or-miss luck the times I've used slightly older cards (interestingly, I've had much better luck with 3D performance than 2D...horrible tearing/update
Re: (Score:2)
...try telling that to Stallman et al!
I'm quite sure he would understand it; I'm also quite sure he wouldn't find it acceptable, though, and that's his prerogative.
Re: (Score:1)
Stallman would prefer it if AMD and nVidia only competed to make better hardware. Which may or may not be realistic. I'm not sure what such a world would look like.
Re: (Score:3, Informative)
...but nvidia offers far better drivers and some extra features like physx
It's more than that. NVIDIA's drivers aren't even that good. It's just that ATI's (AMD's) are so terrible that they look good in comparison. Who the hell decided the Catalyst Control Center was a good idea? It reminds me of some glitchy 1990s spam-laden chat program. What a joke. The drivers are so sketchy that almost every game I'd play would have "STICKY: For ATI users check here first!" at the top of its support forums. Trying to get hardware acceleration to work on my Linux media PC was almost imposs
Likely a faulty chair-to-keyboard interface. (Score:2, Funny)
It reminds me of some glitchy 1990s spam-laden chat program.
Sounds to me like you are using a 1990s card too; AFAIK "Catalyst" is no longer supported, and it's certainly not bundled with recent cards. I updated my NVIDIA driver just the other day; sure, the driver is enormous (250MB), but it installed flawlessly in the background without a reboot. I play WoT regularly at maximum detail on an i7 and have no issues other than the 200ms round trip from Oz to the US, but it stays playable until that hits ~350ms. I've also been mucking around with CUDA for a few months, th
Re: (Score:1)
Re: (Score:2)
Psst. Catalyst is indeed still shipped with ATI drivers, especially the proprietary ones on Linux, and I think you misunderstood: the GP was talking about how he's had shit luck with ATI, but his nVidia drivers were fine.
Thanks! That's exactly what I was saying. :-)
Re:Likely a faulty chair-to-keyboard interface. (Score:4, Informative)
Sounds to me like you are using a 1990s card too; AFAIK "Catalyst" is no longer supported, and it's certainly not bundled with recent cards.
Not only is CCC still a thing, a bug-ridden piece of shit thing which can cause systems to crater and which amounts to 150MB for a preferences GUI, but ATI abandons cards much, much more quickly than does nVidia. Indeed, when I bought my last ATI-graphics product new in the store (so old it's a R690M-based subnotebook) it was all of the following things:
That's right: it was not just obsoleted but abandoned while it was still being sold.
The nvidia driver is enormous because one download supports cards practically back to the time of Methuselah. It hasn't really been that long since they finally abandoned support for literally their oldest cards. AMD abandons them while they're still on store shelves. I don't care if it's because they're spread too thin, or just because they're assholes, or because the heavens conspire against them. It just doesn't make sense to use their graphics cards. You seem to have noticed this, as you have an nVidia card.
Re: (Score:2, Funny)
...Stop creating new cards I can cook an egg on...
I think I've found your problem. What you are looking for is called a skillet, and it does not go in the computer.
Re: (Score:1)
Heh... You'd like to think that, but you'd be mostly wrong.
The problem with "broken" drivers and "quality" is less about optimization and overall fit-and-finish with AMD versus NVidia, and more about something most people wouldn't get unless they'd been digging in either company's proprietary codebases at some point, combined with being IN the games industry and really understanding the story properly.
AMD's drivers tend to explicitly follow the OpenGL standards. To a fault.
NVidia's compensate for many inappropriat
"Quirks mode" all over again? (Score:4, Insightful)
To me it sounds again like the beginning of the Internet Explorer vs. Firefox fight over compliance with HTML standards.
Down to the detail of how it pans out:
- one company is the popular one (Microsoft, Nvidia), so everybody codes to its platform (IE, the drivers) and ends up unknowingly producing bad code that happens to rely on the peculiarities of that platform (the non-standard assumptions of Nvidia's drivers, the weird reinterpretation of HTML done by IE's engine). When there are problems, they tend to hack around them in their own code.
- the other company is the underdog (Mozilla, AMD), making a platform (Firefox, Catalyst) that tries to follow the open standard to the letter (HTML5, OpenGL), but in the end other people's code (websites, game code) behaves poorly because it breaks the standard and relies on quirks that aren't present on that platform. The users complain about the problems (broken HTML rendering worse under Firefox than IE, non-compliant OpenGL code degrading more on AMD than on Nvidia hardware).
Funnily enough, if history is any indicator, in the long run AMD's approach is the better one, and either they or one of their successors is bound to make OpenGL compliance matter more than driver tricks.
(The fact that AMD dominates the current generation of consoles might help give them more leverage.)
Interestingly, the embedded world might also end up helping, just as it did in the browser wars: Internet Explorer was far less prevalent on embedded machines like PDAs/smartphones/tablets than on the desktop, the problems with broken HTML became much more apparent there, and compliance with HTML5 (sure to run on as many platforms as possible) became the deciding factor; the embedded ecosystem also mostly centered around compliant engines like WebKit. The same factors apply here: it's an extremely heterogeneous ecosystem hardware-wise, where Nvidia, with its Tegra platform, is just one player among tons of others. Compliance with OpenGL ES is what is going to be decisive, because the embedded platforms need a lingua franca to ensure that porting an engine is as smooth as possible and works easily on all smartphones/tablets, no matter whether they boast PowerVR, Vivante, Lima, Adreno, etc.
Maybe we need something along the lines of the Acid tests and the W3C conformance tests to exercise drivers and check game code for non-compliance with the standard.
(That partly exists already as "piglit", the test suite that freedesktop.org uses to test the open-source Mesa and Gallium drivers.)
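For illustration, here is a rough sketch of the kind of tiny self-checking test such a suite is built from: clear to a known colour, read a pixel back, and compare. This is not actual piglit code, and it assumes a current OpenGL context and framebuffer have already been set up (e.g. via EGL or GLUT):

    #include <stdio.h>
    #include <GL/gl.h>

    /* Toy conformance check: clear to pure red and verify the result.
       Returns 1 on pass, 0 on fail. Assumes a current GL context. */
    static int test_clear_color(void)
    {
        unsigned char px[4];

        glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        glReadPixels(0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, px);

        if (glGetError() != GL_NO_ERROR) {
            fprintf(stderr, "FAIL: GL error reported during test\n");
            return 0;
        }
        if (px[0] != 255 || px[1] != 0 || px[2] != 0) {
            fprintf(stderr, "FAIL: got (%d,%d,%d), expected (255,0,0)\n",
                    px[0], px[1], px[2]);
            return 0;
        }
        printf("PASS\n");
        return 1;
    }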
Re: (Score:2, Informative)
AMD's drivers tend to explicitly follow the OpenGL standards. To a fault.
That is a popular excuse, especially for the open-source drivers that frequently have problems with newer commercial games, but having more complete support for what is in the standard and being more permissive about what is not are not mutually exclusive. For example, see this page for some actual conformance testing results: http://www.g-truc.net/post-0655.html#menu As you can see, the Nvidia binary driver clearly passes a higher percentage of the tests than any of the others, and it is the only driver to pa
Re: (Score:2)
I'd say the fundamental problem is that the specifications themselves are a patchwork of code changes written in a natural language.
The original specification is written before the original driver code is modified, or derived from an existing driver for one hardware system and then recoded as a new driver for another hardware system. With some other device drivers (networking), each extension is actually specified in a high-level language which can be processed straight into device-driver code.
Di
Re: (Score:2)
I would say this is a matter of opinion. The 750 Ti is a MONSTER at its price point, especially with ShadowPlay.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
First, I don't think AMD (or any other company) execs would recognize driver optimization if it hit them in the face.
Second, do you think nVidia is hiring from a different talent pool than AMD? Neither company has any special secret magic-sauce driver optimizations that a well-trained monkey at the other company cannot come up with. If you look into the nouveau and radeon open-source kernel and Mesa drivers, you will be able to see how much easier nVidia hardware is to work with; that may be one of the reasons n
Re: (Score:2)
In my experience the AMD drivers & software have been more stable. Probably I just have bad luck, but just today I got a pop-up from the Nvidia Experience saying there's an update. I clicked it and got 'cannot connect to Nvidia servers' or something similar. The last driver update failed to update one of the components. And the display adapter has crashed once (it managed to recover, though).
I'm currently using a GTX 670 (got the Windforce model and it is really quiet) and I'm reasonably happy with it, but I had no
Re: (Score:2)
I don't use nvidia experience as I prefer to have manual control over most of the card's features. That said, I've had the update problems as well, and most of them were firewall-related.
Re: Optimizing the driver stack... (Score:1)
Re: (Score:1)
Launch-day games I've tried with no issues on my 5870 include BF3 (including the "paid beta"), Sup Com 2, StarCraft 2, Portal 2, the Crysis 2 trial, NFS: Shift, Metal Gear Rising, Borderlands 2, Skyrim*, Train Simulator 2014* (* = launch + 1 month).
The only issues I've had were with a few nvidia-favoring demoscene demos and Bitcoin-related OpenCL driver combinations.
Re: (Score:2)
Good for you. I've had more than my share of games that worked atrociously on release day back when I was still playing on a 4870, and driver settings that just plain refused to work.
I'm not alone in that experience either, unfortunately. It's not like it has scared me off AMD; as I said, I still use it. Just not for performance stuff, where I need reliability and stability more than a few extra FPS.
Re: (Score:2)
Re: (Score:2)
It's a well known reality that ATI had severe problems with driver quality from early stages, and these problems persist today long after AMD bought the company.
And by "from early stages" you mean from the beginning, I hope. I've been having ATI graphics blow up Windows since 3.1 with the Mach32. Even Radius made more reliable video cards. I wish they'd stuck around and ATI were gone now.
GameWorks is an arcade (Score:1)
This article is very confusing to me
Sometimes things aren't done for evil. (Score:3)
Maybe it's time for a new graphics API standard? (Score:2)
When OpenGL 1.0-1.3 (and DX5/6/7) was king, GPUs were fixed-function rasterizers with a short list of togglable features. These days pixel and vertex shaders have become the default way to program GPUs, making the rest of the API obsolete. It's time for the principal vendors to rebuild the list of assumptions about what GPUs can and should be doing, design an API around that, and build hardware-specific drivers accordingly.
The last thing I want is another Glide vs. Speedy3D... err, I mean AMD Mant
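(To illustrate what "the default way" looks like in practice, here is a minimal sketch of compiling a hand-written GLSL shader pair at runtime, with no fixed-function state at all. It assumes a GL 3.2+ context and that the function prototypes come from a loader such as GLEW or glad; status checks are reduced to a comment for brevity.)

    #include <GL/glew.h>   /* assumption: GLEW (or glad) provides the GL 3.x prototypes */

    static const char *vs_src =
        "#version 150\n"
        "in vec2 pos;\n"
        "void main() { gl_Position = vec4(pos, 0.0, 1.0); }\n";

    static const char *fs_src =
        "#version 150\n"
        "out vec4 color;\n"
        "void main() { color = vec4(1.0, 0.5, 0.0, 1.0); }\n";

    /* Compile and link a trivial program: this is the whole "pipeline" now. */
    static GLuint build_program(void)
    {
        GLuint vs = glCreateShader(GL_VERTEX_SHADER);
        GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
        GLuint prog = glCreateProgram();

        glShaderSource(vs, 1, &vs_src, NULL);
        glCompileShader(vs);
        glShaderSource(fs, 1, &fs_src, NULL);
        glCompileShader(fs);

        glAttachShader(prog, vs);
        glAttachShader(prog, fs);
        glLinkProgram(prog);
        /* A real engine would check GL_COMPILE_STATUS / GL_LINK_STATUS here. */
        return prog;
    }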
Re: (Score:2)
It's time for the principal vendors to rebuild the list of assumptions of what gpus can and should be doing, design an api around that, and build hardware specific drivers accordingly.
For the most part, they've done that. In OpenGL 3.0, all the fixed-function stuff was deprecated. In 3.1, it was removed. That was a long, long time ago.
More recently, while AMD has introduced the Mantle API and Microsoft announces vague plans for DX12, both with the goal of reducing CPU overhead as much as possible, OpenGL already has significant low-overhead support [gdcvault.com].
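(For anyone curious what that low-overhead support looks like in practice, here is a rough sketch of one such technique: a persistently mapped buffer via GL 4.4's glBufferStorage, mapped once and reused every frame instead of re-mapping or calling glBufferSubData per draw. It assumes a GL 4.4 context with prototypes from a loader such as GLEW, and a real renderer would still use glFenceSync so the GPU never reads a region that is being rewritten.)

    #include <stddef.h>
    #include <GL/glew.h>   /* assumption: GLEW (or glad) provides the GL 4.4 prototypes */

    /* Create a vertex buffer with immutable storage and map it once, forever. */
    static float *create_persistent_vbo(GLuint *vbo_out, size_t bytes)
    {
        GLuint vbo;
        GLbitfield flags = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;

        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferStorage(GL_ARRAY_BUFFER, (GLsizeiptr)bytes, NULL, flags);

        *vbo_out = vbo;
        /* The returned pointer stays valid until the buffer is deleted;
           per-frame vertex data is simply memcpy'd into it. */
        return (float *)glMapBufferRange(GL_ARRAY_BUFFER, 0, (GLsizeiptr)bytes, flags);
    }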
Re: (Score:2)
Re: (Score:2)
You should look at the latest OpenGL ES specification. This is OpenGL optimized for mobile devices and gets rid of most of the old API bits while still supporting vertex, fragment and compute shaders. Anything else is just implemented using shaders.
But Mantle gives you access to the hardware registers (those descriptors) while avoiding the overhead of updating the OpenGL state, then determining what has and hasn't changed, and then writing those values out to the hardware.
Wrong target of blame. (Score:1, Insightful)
Re: (Score:2)
Which nvidia drivers do you compile?
geforce is a binary driver; nv doesn't support 3D and is no longer supported.
nouveau isn't developed by nvidia.
Re:Wrong target of blame. (Score:4, Funny)
The part of the driver which is compiled as a kernel module to serve as an adapter for the binary blob?
You thought that it wanted the linux-headers package just for the fun of reading it on its own time?
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
geforce is a binary driver
And how do you go about having it support a kernel with an unstable ABI?
AMD supports OpenGL just fine (Score:2)
Re: (Score:3)
AMD supports OpenGL just fine, but it doesn't fail sloppy programming gracefully. The Nvidia driver tends to try to make "something you probably sort of meant anyway" out of your illegal OpenGL calls, while AMD fails you hard with an error message. That's no reason to blame the manufacturer.
Nvidia is hewing to the following:
Robustness principle [wikipedia.org]
In computing, the robustness principle is a general design guideline for software:
Be conservative in what you do, be liberal in what you accept from others (often reworded as "Be conservative in what you send, be liberal in what you accept").
The principle is also known as Postel's law, after Internet pioneer Jon Postel, who wrote in an early specification of the Transmission Control Protocol
It's generally a good idea when i
Re: (Score:2)
That's interesting. Coding as a kid, I more or less came up with the same principle for my little programs. I also later figured that it was misguided to leave robustness up to the implementation, instead of the specification (or in my case the function definition).
API functions that have any reasonable expectation of default values should just define those defaults, not silently default to something seemingly random and completely undocumented.
Re: (Score:2)
It's a garbage principle that makes a mess of the ecosystem, because then you have each implementation making different decisions on just how much slop you allow, resulting in programs that work differently on different systems. It's better to have hard errors.
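(For what it's worth, a developer can get those hard errors today regardless of how forgiving the driver is. A rough sketch of the common pattern: wrap every call and bail on any reported error; the KHR_debug callback is the fancier alternative.)

    #include <stdio.h>
    #include <stdlib.h>
    #include <GL/gl.h>

    /* Fail hard on any GL error, no matter how lenient the driver wants to be. */
    #define GL_CHECK(call)                                                   \
        do {                                                                 \
            call;                                                            \
            GLenum err_ = glGetError();                                      \
            if (err_ != GL_NO_ERROR) {                                       \
                fprintf(stderr, "GL error 0x%x after %s (%s:%d)\n",          \
                        err_, #call, __FILE__, __LINE__);                    \
                abort();                                                     \
            }                                                                \
        } while (0)

    /* Usage: GL_CHECK(glBindTexture(GL_TEXTURE_2D, tex)); */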
Re: (Score:2)
While I agree that the principle can result in a mess if misapplied, my interpretation has always been that "be liberal in what you accept" only means that you should avoid defining rigid input formats full of arbitrary rules. If there are a number of different ways to say the same thing, such that it's still clear what was meant, accept them all as equivalent. Allow independent modifiers to be written in any order; don't put artificial restrictions on which characters can be used in a name, or the specific
Re: (Score:2)
While I agree that the principle can result in a mess if misapplied, my interpretation has always been that "be liberal in what you accept" only means that you should avoid defining rigid input formats full of arbitrary rules.
If you read the Wikipedia article [wikipedia.org], you'll see that it came about as advice for implementing the TCP protocol.
Re: (Score:2)
If you read the Wikipedia article, you'll see that it came about as advice for implementing the TCP protocol.
Yes, and I did say that it can result in a mess if misapplied. The right time to consider what to accept and how to interpret it would have been when writing the TCP standard, not at the implementation stage, but we can't always have what we want. It's still a good idea to be liberal in what you accept, perhaps especially so when the standard is lacking in detail, since you never know just how the sender might interpret a particular section. You need to make your software work with all other reasonable impl
Re: (Score:2)
Yes, and I did say that it can result in a mess if misapplied.
You can't tell me it's being misapplied when the origin applies it in exactly that manner. You misunderstood, that is all.
Re: (Score:3, Insightful)
Re: (Score:3)
Just by coincidence, a lot of Nvidia engineers were "inherited" from SGI.
Re: (Score:1)
The PS4 also uses OpenGL.
Re: (Score:3)
No, it doesn't. Can we stop with this myth? The only main console to have supported OpenGL to some degree was the PS3 with the very slow PSGL (OpenGL ES 1.0 + Nvidia Cg shaders + proprietary extensions) that only a handful of indie PSN titles ever bothered to use for easy porting.
Re: (Score:3)
Frankly, I think ATi made a huge engineering mistake by focusing only on Win32 and not supporting Unix as a first-class citizen from day one,
Yeah, how stupid of them to focus on a platform that has 90%+ of the market. Clearly it would have been a better decision to dump all their resources into a niche platform.
And looking in the mirror.. (Score:1, Flamebait)
"Since AMD debuted its Mantle libraries there's been allegations that they unfairly disadvantaged NVIDIA users or prevented developers from optimizing code."
Get the idea?
Re: (Score:1)
Different perspective (Score:5, Insightful)
AMD's perspective is that Mantle is less problematic:
- Mantle's specs are open.
- Also, it's just a very thin layer above the bare hardware. Actual problems will mostly be confined to the game engine itself.
- Game engine code is still completely in the hands of the developer, and any bug or shortcoming is fixable.
Whereas, regarding GameWorks:
- It's a closed-source black box.
- It's a huge piece of middleware, i.e. part of the engine itself.
- The part of the engine that is GameWorks is closed, and if there are any problems (like not following the standard and stalling the pipeline) there's no way a developer will notice them and be able to fix them, even if AMD is willing to help. Nvidia, on the other hand, could fix this by patching around the problem in the driver (as usual), because they control the stack.
So from their point of view, and given their philosophies, GameWorks is really destructive, both to them and to the whole market in general (GameWorks is as much of a problem for ATI as it is for Intel [even if Intel is a smaller player] and for the huge, diverse ecosystem of 3D chips in smartphones and tablets).
Now, shift the perspective to Nvidia.
First, they are the dominant player (AMD is much smaller, even if it's the only other one worth considering).
So most people are going to optimise their games heavily for Nvidia hardware, and then maybe provide an alternate "also-ran" back-end for Mantle (just like in the old days of Glide / OpenGL / DX back-ends).
What does Mantle bring to the table? Better driver performance? Well... Nvidia has been in the driver optimisation business *FOR AGES*, and they are already very good at it. Which is more likely: that when performance problems show up, developers will jump en masse to a newer API that is only available from one non-dominant PC player and a few consoles, and is completely missing on every other platform? Or that Nvidia will patch around the problem by hacking their own platform, and devs will continue to use the APIs they already use?
From Nvidia's perspective and way of working, Mantle is completely irrelevant, barely registering as a "blip" on the marketing radar.
That's why there's some outcry against GameWorks, whereas the most Mantle has managed to attract is a "meh" (and it will mostly be considered yet another wannabe API that's going to die in the mid to long term).
Re: (Score:1)
Nvidia has been in the driver optimisation business *FOR AGES*, and they are already very good at it.
So good they've been killing their own cards for years now.
2010 http://www.zdnet.com/blog/hard... [zdnet.com]
2011 http://forums.guru3d.com/showt... [guru3d.com]
2013 http://modcrash.com/nvidia-dis... [modcrash.com]
This has never happened once to AMD cards, because they're more conservative with their optimizations. NV isn't even the price/performance leader and rarely is. So you get to spend more, and they optimize the crap out of your drivers and card until they break it.
They're averaging almost once a year at killing cards. No thanks. While both
Why AMD doesn't deserve sympathy (Score:2)
Re: (Score:2, Interesting)
It is pretty obvious that AMD/ATi has always favored Windows/Microsoft and has put minimal effort into supporting Unix based platforms.
The same is true of nVidia; the definition of "minimal" over there is simply greater than it is at AMD. nVidia is well known to have aimed its cards directly at D3D support and filled in the OpenGL gaps with [slower] software in the past. The difference is either in where they threw up their hands and said fuck it, or simply in competence. They, too, put more of their effort into development for Windows. But they also manage to put together a working Linux driver. As you say, ATI can't even
NVIDIA has better experience (Score:1)
Avoiding answering (Score:5, Informative)
Nvidia PAYS for removal of features that work better on AMD
http://www.bit-tech.net/news/h... [bit-tech.net]
Nvidia pays for insertion of USELESS features that work faster on their hardware
http://techreport.com/review/2... [techreport.com]
Nvidia cripples their own middleware to disadvantage competitors
http://arstechnica.com/gaming/... [arstechnica.com]
Intel did the same, but the FTC put a stop to it
http://www.osnews.com/story/22... [osnews.com]
So how exactly is that not Nvidia's doing??
Nvidia is evil and plays dirty. They don't want your games to be good; they want them to be fast on Nvidia, by any means necessary. They use the "Meant to be Played" program to lure developers in, pay them off, and hijack their games to further Nvidia's goals.
For example, how come Watch Dogs, a console title built from the ground up with AMD GPU/CPU optimizations to run well on both current-gen consoles, is crippled on PC when played on AMD hardware? How does this shit happen?
This is something the FTC should weigh in on, just like in Intel's case.
Re: (Score:2)
http://www.bit-tech.net/news/h... [bit-tech.net] - What? It's just speculation in that story.
http://techreport.com/review/2... [techreport.com] - the article doesn't state that Nvidia paid anyone; that's a claim you made up yourself.
At this point, having looked up the first two links, I decided not to waste any more of my time.
Re: (Score:2)
http://la.nvidia.com/object/nz... [nvidia.com]
"The Way It's Meant to be Played"
Nvidia pays you a shitload of money for participating in this program, and can additionally guarantee certain sales targets (by bundling your product with their GPUs).
In order to participate you only have to do two things: insert an Nvidia ad clip at the start of the game, and let Nvidia rape your codebase.
On paper Nvidia pays you for a joint marketing campaign, but deep down in the paperwork you are letting them decide what your codebase will look lik
Re: (Score:3, Informative)
Nvidia PAYS for removal of features that work better on AMD
http://www.bit-tech.net/news/h... [bit-tech.net]
Reading the link you posted above, it seems like a bit of a non-factual load of waffle. Nvidia deny paying, Ubisoft deny being paid, and the only sources mentioned are anonymous speculators who, for all we know, are just a few paid ATI shills.
Nvidia pays for insertion of USELESS features that work faster on their hardware
http://techreport.com/review/2... [techreport.com]
Wow, another example of amazing journalism here.
Some guy moaning about Crysis having loads of detailing that is only used in the DirectX 11 game. He gives loads of examples of this, then posts a summary page of wild speculation with no sources quoted other than h
Misconception in the OP (Score:1)
AMD made about $1.4 billion off the Radeon division. For the same period, NVIDIA made more than $4.2 billion. Some of that was Tegra-related and it's a testament to AMD's hardware engineering that it competes effectively with Nvidia with a much smaller revenue share, but it also means that Team Green has far more money to spend on optimizing every aspect of the driver stack.'"
While that's true for revenue, the profits of AMD and NV are very close.
Re: (Score:2)
How is -$83 million (AMD) close to $581 million (nVidia)?