AMD, NVIDIA, and Developers Weigh In On GameWorks Controversy

Dputiger writes: "Since NVIDIA debuted its GameWorks libraries, there have been allegations that they unfairly disadvantage AMD users or prevent developers from optimizing code. We've taken these questions to developers themselves and asked them to weigh in on how games get optimized, why NVIDIA built this program, and whether it's an attempt to harm AMD customers. 'The first thing to understand about [developer/GPU manufacturer] relations is that the process of game optimization is nuanced and complex. The reason AMD and NVIDIA are taking different positions on this topic isn't because one of them is lying; it's because AMD genuinely tends to focus more on helping developers optimize their own engines, while NVIDIA puts more effort into performing tasks in-driver. This is a difference of degree; AMD absolutely can perform its own driver-side optimization, and NVIDIA's Tony Tamasi acknowledged on the phone that there are some bugs that can only be fixed by looking at the source. ... Some of this difference in approach is cultural, but much of it is driven by necessity. In 2012 (the last year before AMD's graphics revenue was rolled into the console business), AMD made about $1.4 billion off the Radeon division. For the same period, NVIDIA made more than $4.2 billion. Some of that was Tegra-related, and it's a testament to AMD's hardware engineering that it competes effectively with NVIDIA on a much smaller revenue share, but it also means that Team Green has far more money to spend on optimizing every aspect of the driver stack.'"

  • by Charliemopps ( 1157495 ) on Wednesday June 04, 2014 @12:27AM (#47162039)

    ...but nvidia offers far better drivers and some extra features like physx

    It's more than that. NVIDIA's drivers aren't even that good; it's just that ATI's (AMD's) are so terrible that they look good in comparison. Who the hell decided the Catalyst Control Center was a good idea? It reminds me of some glitchy, spam-laden 1990s chat program. What a joke. The drivers are so sketchy that almost every game I'd play would have "STICKY: For ATI users check here first!" at the top of its support forums. Trying to get hardware acceleration to work on my Linux media PC was almost impossible until I switched to NVIDIA. Stop creating new cards I can cook an egg on and fix your damned drivers. I have enough fried eggs; I just want to watch a movie without spending 30 minutes dinking around with arcane driver settings while my wife keeps asking me why we canceled cable.

  • by Anonymous Coward on Wednesday June 04, 2014 @05:23AM (#47162715)

    AMD's drivers tend to explicitly follow the OpenGL standards. To a fault.

    That is a popular excuse, especially for the open source drivers that frequently have problems with newer commercial games, but having more complete support for what is in the standard and being more permissive about what is not are not mutually exclusive. For example, see http://www.g-truc.net/post-0655.html#menu for some actual conformance testing results. As you can see, the Nvidia binary driver clearly passes a higher percentage of the tests than any of the others, and it is the only driver to pass all samples from OpenGL 3.3 to 4.4.

    From a consumer's point of view, it is also a poor attitude for Mesa developers to interpret "implementation-defined behavior" as "license to break anything as we see fit" (GCC developers tend to do the same, by the way, even though the compiler has its own set of non-standard extensions as well). They are free to add a configuration option that lets the user choose between strict conformance (mainly for developers testing their code) and maximum compatibility, but casual consumers will not care why the game they paid for fails to work. If it keeps happening, they will ignore the excuses and just delete Linux and go back to Windows/Direct3D.
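    To make the "which driver am I actually on, and what does it claim to support" question concrete, here is a minimal sketch of my own (not from the linked conformance suite; it assumes GLFW is available for context creation) that just prints the vendor, renderer, and advertised GL version strings, which is usually the first thing to check when chasing a conformance or compatibility difference between drivers:

    #include <stdio.h>
    #include <GLFW/glfw3.h>   /* assumed: GLFW for context creation; it pulls in the GL header */

    int main(void)
    {
        if (!glfwInit()) {
            fprintf(stderr, "glfwInit failed\n");
            return 1;
        }
        /* Hidden window: we only need a current GL context to query strings. */
        glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
        GLFWwindow *win = glfwCreateWindow(64, 64, "gl-info", NULL, NULL);
        if (!win) {
            glfwTerminate();
            return 1;
        }
        glfwMakeContextCurrent(win);

        /* These strings identify the driver stack (Mesa, NVIDIA binary, fglrx, ...)
           and the GL version it claims to implement. */
        printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));

        glfwDestroyWindow(win);
        glfwTerminate();
        return 0;
    }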

  • Avoiding the answer (Score:5, Informative)

    by citizenr ( 871508 ) on Wednesday June 04, 2014 @07:12AM (#47163041) Homepage

    Nvidia PAYS for removal of features that work better on AMD

    http://www.bit-tech.net/news/h... [bit-tech.net]

    Nvidia pays for insertion of USELESS features that work faster on their hardware

    http://techreport.com/review/2... [techreport.com]

    Nvidia cripples their own middleware to disadvantage competitors

    http://arstechnica.com/gaming/... [arstechnica.com]

    Intel did the same, but the FTC put a stop to it
    http://www.osnews.com/story/22... [osnews.com]

    So how exactly is that not Nvidia's doing?

    Nvidia is evil and plays dirty. They don't want your games to be good, they want them to be fast on Nvidia, by any means necessary. They use the "The Way It's Meant to Be Played" program to lure developers in, pay them off, and hijack their games to further Nvidia's goals.

    For example, how come Watch Dogs, a console title built from the ground up with AMD GPU/CPU optimizations to run well on both current-gen consoles, is crippled on PC when played on AMD hardware? How does this shit happen?

    This is something the FTC should weigh in on, just like in Intel's case.

  • by Ash Vince ( 602485 ) * on Wednesday June 04, 2014 @08:31AM (#47163373) Journal

    Nvidia PAYS for removal of features that work better on AMD

    http://www.bit-tech.net/news/h... [bit-tech.net]

    Reading the link you posted above, it seems like a non-factual load of waffle. Nvidia deny paying, Ubisoft deny being paid, and the only sources mentioned are anonymous speculators who, for all we know, could just be a few paid ATI shills.

    Nvidia pays for insertion of USELESS features that work faster on their hardware

    http://techreport.com/review/2... [techreport.com]

    Wow, another example of amazing journalism here.

    It's some guy moaning about Crysis having loads of detailing that is only used in the game's DirectX 11 mode. He gives loads of examples of this, then posts a summary page of wild speculation with no sources quoted other than his own imagination. He never asks any of the companies involved; he just posts a bunch of guesses about why this might be the case.

    I have another possible explanation: Crytek like making stuff look overly detailed and include graphics detailing that means their games continue to max out graphics cards long after they are released. They always make their games playable on budget cards if you crank the detail down, but they also like catering to people who buy a new graphics card and then go back and play a few oldies they previously had to turn down. Crytek also probably quite like their games being used in hardware reviews, because their games hammer the hardware.

    Nvidia cripples their own middleware to disadvantage competitors

    http://arstechnica.com/gaming/... [arstechnica.com]

    OK, congratulations on actually posting an article that is real journalism, with quoted sources rather than just the author's own conjecture.

    The issue here, though, seems to be that there was an optimisation (moving from x87 to SSE) that they did not retrofit onto a bunch of legacy code. Instead they rewrote that code from scratch, which meant it took somewhat longer to get to SSE.

    This was not them intentionally doing something to hobble a competitor; it was them not doing anything to help a competitor quickly. That is very different.
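    For context (my own illustration, not from the linked article): the practical difference is between legacy scalar floating-point code, which old compilers lowered to x87 instructions, and vectorised SSE code that processes four floats per instruction. A minimal sketch using SSE intrinsics:

    #include <stdio.h>
    #include <xmmintrin.h>  /* SSE intrinsics */

    /* Scalar version: with old compiler settings this lowers to x87 code,
       handling one float at a time. */
    static void add_scalar(const float *a, const float *b, float *out, int n)
    {
        for (int i = 0; i < n; i++)
            out[i] = a[i] + b[i];
    }

    /* SSE version: processes four floats per instruction.
       Assumes n is a multiple of 4 and the pointers are 16-byte aligned. */
    static void add_sse(const float *a, const float *b, float *out, int n)
    {
        for (int i = 0; i < n; i += 4) {
            __m128 va = _mm_load_ps(a + i);
            __m128 vb = _mm_load_ps(b + i);
            _mm_store_ps(out + i, _mm_add_ps(va, vb));
        }
    }

    int main(void)
    {
        /* 16-byte aligned buffers for the aligned SSE loads/stores (GCC/Clang syntax). */
        static __attribute__((aligned(16))) float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        static __attribute__((aligned(16))) float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
        static __attribute__((aligned(16))) float r[8];

        add_scalar(a, b, r, 8);
        add_sse(a, b, r, 8);
        printf("%f\n", r[0]);  /* 9.0 either way; only the instructions used differ */
        return 0;
    }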

    They did however ultimately fix it:

    "PhysX SDK 3.0 was released in May 2011 and represented a significant rewrite of the SDK, bringing improvements such as more efficient multithreading and a unified code base for all supported platforms"

    Intel did the same, but the FTC put a stop to it
    http://www.osnews.com/story/22... [osnews.com]

    There is a massive difference here: Intel were intentionally hobbling the code their compiler created based on finding a competing vendor's name in the CPU's vendor string. They did not say "wait for version 3" like in the PhysX case; they just did it and then sat there tight-lipped until it went to court and they were forced to change it.
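    For what it's worth (my own sketch, not something from the comment or the linked story): the check in question boils down to reading the CPUID vendor string, which is trivially easy to branch on. Using GCC/Clang's <cpuid.h> helper:

    #include <stdio.h>
    #include <string.h>
    #include <cpuid.h>  /* GCC/Clang wrapper for the CPUID instruction */

    int main(void)
    {
        unsigned int eax, ebx, ecx, edx;
        char vendor[13] = {0};

        /* CPUID leaf 0 returns the 12-byte vendor string in EBX, EDX, ECX. */
        if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx)) {
            fprintf(stderr, "CPUID not supported\n");
            return 1;
        }
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);

        printf("CPU vendor: %s\n", vendor);

        /* A dispatcher keyed on this string (rather than on actual feature bits)
           is the kind of check that case was about. */
        if (strcmp(vendor, "GenuineIntel") == 0)
            printf("would take the fast path\n");
        else
            printf("would take the slow path\n");
        return 0;
    }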

    This is something the FTC should weigh in on, just like in Intel's case.

    As I said earlier, Nvidia made the all-important change to use SSE when running PhysX on the CPU without the FTC being involved.

  • Sounds to me like you are using a 1990s card too; AFAIK "Catalyst" is no longer supported, and it's certainly not bundled with recent cards.

    Not only is CCC still a thing, a bug-ridden piece of shit thing which can cause systems to crater and which amounts to 150MB for a preferences GUI, but ATI abandons cards much, much more quickly than nVidia does. Indeed, when I bought my last ATI-graphics product new in the store (so old it's an R690M-based subnotebook), it was all of the following things:

    • Never getting another/newer official graphics driver for any OS
    • Unsupported by fglrx as being "too old" and unsupported by the OSS ati driver as being "too new"

    That's right: it was not just obsoleted but abandoned while it was still being sold.

    The nvidia driver is enormous because one download supports cards practically back to the time of Methuselah. It hasn't really been that long since they finally abandoned support for literally their oldest cards. AMD abandons them while they're still on store shelves. I don't care if it's because they're spread too thin, or just because they're assholes, or because the heavens conspire against them. It just doesn't make sense to use their graphics cards. You seem to have noticed this, as you have an nVidia card.
