
Game Devs Only Use PhysX For the Money, Says AMD

arcticstoat writes "AMD has just aimed a shot at Nvidia's PhysX technology, saying that most game developers only implement GPU-accelerated PhysX for the money. AMD's Richard Huddy explained that 'Nvidia creates a marketing deal with a title, and then as part of that marketing deal, they have the right to go in and implement PhysX in the game.' However, he adds that 'the problem with that is obviously that the game developer doesn't actually want it. They're not doing it because they want it; they're doing it because they're paid to do it. So we have a rather artificial situation at the moment where you see PhysX in games, but it isn't because the game developer wants it in there.' AMD is pushing open standards such as OpenCL and DirectCompute as alternatives to PhysX, as these APIs can run on both AMD and Nvidia GPUs. AMD also announced today that it will be giving away free versions of Pixelux's DMM2 physics engine, which now includes Bullet Physics, to some game developers."
This discussion has been archived. No new comments can be posted.

  • Maybe (Score:4, Interesting)

    by Hatta ( 162192 ) on Monday March 08, 2010 @02:18PM (#31403036) Journal

    I wouldn't be surprised if most game devs wouldn't implement PhysX if not for a subsidy. Only half the market is going to be able to take advantage of it after all. It may not be that they don't want it, just that it's not an economical use of their time otherwise.

  • Re:Maybe (Score:2, Interesting)

    by Vorknkx ( 929512 ) on Monday March 08, 2010 @02:21PM (#31403070)
    Exactly. Havok and in-house physics engines are perfectly fine for physics simulations in games. I don't see why we need another third-party physics engine. Flying boxes and wood splinters do not make a better game.
  • by Anonymous Coward on Monday March 08, 2010 @02:27PM (#31403172)

    PhysX adds nothing to the gameplay. It's just stupid clutter on the ground... at least in any games I've played that use it. As an Nvidia user, PhysX is no longer a reason to stick with the brand... at least not now that I have used it.

    That said, it's no surprise to me that game developers wouldn't support it without an incentive.

  • Re:Maybe (Score:3, Interesting)

    by Monkeedude1212 ( 1560403 ) on Monday March 08, 2010 @02:31PM (#31403222) Journal

    Flying boxes and wood splinters do not make a better game.

    Well - it's the little things that make the difference though. I mean, you wouldn't think that flying boxes and wood splinters make a game any more amazing, but those were basically THE core elements of The Force Unleashed, using the Havok engine. Not surprisingly though, Havok was strictly licensed to LucasArts for all of 2009 - no one else could use it. It's only just recently become available again. So - for most of 2009, PhysX was the best choice: not only were developers subsidized for using it, but its competitors weren't actually available.

  • by DarkkOne ( 741046 ) on Monday March 08, 2010 @02:34PM (#31403268) Homepage Journal
    Even before hardware-accelerated PhysX was on CUDA, when you only got it with the standalone card, I always thought PhysX looked a bit nicer than Havok in action. I've been wishing more games used PhysX for a while, but it seems that if a game is going to be cross-targeted to the consoles as well, Havok is just a lot more likely. It may just be my own perception, but things seem to have a bit more consistent behaviour in regard to momentum and mass in PhysX, whereas Havok seems a bit "floaty" a lot of the time. This may just be a result of the constants designers pick, or something; I don't really know the details. But I personally just like PhysX better, from a player standpoint, hardware accelerated or not.
  • by Fwipp ( 1473271 ) on Monday March 08, 2010 @02:44PM (#31403408)

    Did you miss the part where OpenCL and DirectCompute run on both NVIDIA and AMD graphics cards, and that AMD is promoting an open industry standard instead of a proprietary vendor-specific API?

    Because I know it was really obvious, and sort of the entire point of the article, but it really sounds like you did.
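    [Editor's note: that vendor neutrality is visible at the code level. The sketch below is a minimal, hypothetical illustration, not from the article: the OpenCL C kernel string is standard OpenCL that either vendor's driver can compile, and the pure-Python function is just a reference for what it computes. Host-side dispatch (e.g. via a binding such as pyopencl) is omitted so the sketch stays self-contained.]

```python
# A vendor-neutral OpenCL kernel: the same data-parallel physics step
# runs on both AMD and Nvidia GPUs, unlike a vendor-specific API.
KERNEL_SRC = """
__kernel void integrate(__global float *pos,
                        __global const float *vel,
                        const float dt)
{
    int i = get_global_id(0);   // one work-item per particle
    pos[i] += vel[i] * dt;      // explicit Euler position update
}
"""

def integrate_reference(pos, vel, dt):
    """CPU reference for the kernel above: pos[i] += vel[i] * dt."""
    return [p + v * dt for p, v in zip(pos, vel)]

positions = [0.0, 1.0, 2.0]
velocities = [1.0, 1.0, -2.0]
print(integrate_reference(positions, velocities, 0.5))
# -> [0.5, 1.5, 1.0]
```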

  • by NeutronCowboy ( 896098 ) on Monday March 08, 2010 @02:46PM (#31403438)

    Look at the alternative: instead of adding useless physics to a game that doesn't need it, they could be adding advertisements. Advertising dollars are dollars nonetheless, and I much prefer a quick splash screen of "powered by PhysX" and some mindless physics interactions to an in-game billboard (possibly even updated over the Internet, shudder).

  • by BatGnat ( 1568391 ) on Monday March 08, 2010 @02:52PM (#31403502)
    Just because some companies use PhysX for pretty effects only does not mean that someone else won't come along and use it for something cool that will add something to gameplay...
  • by mpapet ( 761907 ) on Monday March 08, 2010 @03:04PM (#31403680) Homepage

    This kind of incentive is anti-competitive.

    1. It eliminates competition by feature/functionality.
    2. It meaningfully constrains innovation. A novel product whose maker lacks the capital to participate is shut out. (That's the goal, anyway.)

    That said, this kind of incentivizing is everywhere (game consoles, mega-retailers, mobile phones). No one seems to care about the increased costs consumers assume or the constraint on innovation.

    I have my bias, what is yours?

  • by Z34107 ( 925136 ) on Monday March 08, 2010 @03:19PM (#31403882)

    It's a difference in scale over Havok. I haven't had much time to play video games lately, but I saw a particularly nifty shot from Arkham Asylum. Shoot a bookshelf without PhysX and it falls over. Shoot it with PhysX and suddenly every individual page from every book flies through the air, each tracing its own path down from the sky.

    So, you can do physics in Havok. But not on that scale.

    I'd suspect the reason it's not being used for anything other than "ground clutter" is that you can't design your game around PhysX - not everyone has an Nvidia card. So PhysX has to be optional and can't change gameplay - which pretty much relegates it to ground clutter.

  • by KillShill ( 877105 ) on Monday March 08, 2010 @03:32PM (#31404032)

    Nvidia is very anti-competitive and has been for a very long time.

    The recent "making PhysX stop working when an AMD graphics card is present" episode is just one of the more public examples of their unethical behavior.

    I wish someone would expose all of their shenanigans and anti-competitive practices so people can realize how badly these things affect the industry and consumers (ugh, hate that word).

    The most recent thing I read about their practices concerns the upcoming PC game Just Cause 2. There's a trailer showing off Nvidia-only effects... (something which is dead-standard DirectX code) while artificially blocking AMD and others from getting the benefits. The Batman: Arkham Asylum scandal is one more people may recall. They (and their users/shills) claim that TWIMTBP is just "marketing"... it's more like bribery and blocking out the competition. They've been caught on many occasions, but the public rarely sees anything negative about them.

    Nvidia is the Intel/Microsoft of the video card industry but unlike them, isn't quite as dominant (thankfully for us) but they still do a hell of a lot of damage. (The Jupiter of the computer industry... too small to become a sun but still an 800 quadrillion ton gorilla).

    I stopped buying Nvidia cards after the GeForce 2 - at that time for performance reasons, but since then I vote with my wallet and let others know, to support fair and legal competition.

  • by SharpFang ( 651121 ) on Monday March 08, 2010 @03:33PM (#31404038) Homepage Journal

    A friend told me about his experience with Utopia. It implemented GPU-accelerated physics in one of its recent patches. But try as he might, he failed to notice any difference over weeks of gameplay. Until he entered the central city, with flags by the entrance fluttering smoothly in the wind instead of the old static animation.

    Yep, that's it. Many megabytes of a patch, a game with hundreds of miles of terrain, hundreds of locations, battles, vehicles, all that stuff... and physics acceleration is used to make the flags by the entrance flutter.

  • by obarthelemy ( 160321 ) on Monday March 08, 2010 @03:39PM (#31404110)

    There are about 25 million PCs sold per month. I guess ATI is happy to have sold the equivalent of 8% of that monthly amount over the several months their 5xxx series has been available - that's 3-4% of PC sales. Congrats to them, but still fairly marginal.

    Discrete cards have always been better than IGPs. I don't really get your point. Only recently (definitely way after the pentium 1) have IGPs become good enough to display all video files, or handle Aero.

    PhysX is about performing physics computations, not directly putting pixels on the screen, so it's a kind of specialized GPGPU.
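    [Editor's note: as a rough illustration of the kind of arithmetic such a GPGPU physics pass performs - a minimal, hypothetical sketch, not PhysX's actual API - here is one explicit-Euler step of a Hooke's-law spring joining two point masses. Middleware like PhysX runs many thousands of updates like this per frame, producing positions and velocities rather than pixels.]

```python
def spring_step(x1, x2, v1, v2, k, rest, m, dt):
    """Advance two equal masses joined by a spring by one explicit Euler step."""
    stretch = (x2 - x1) - rest        # deviation from rest length
    f = k * stretch                   # Hooke's law: F = k * stretch
    a1, a2 = f / m, -f / m            # equal and opposite accelerations
    v1, v2 = v1 + a1 * dt, v2 + a2 * dt
    return x1 + v1 * dt, x2 + v2 * dt, v1, v2

# Spring stretched beyond its rest length, so the masses pull together.
x1, x2, v1, v2 = spring_step(0.0, 2.0, 0.0, 0.0, k=10.0, rest=1.0, m=1.0, dt=0.1)
print(x1, x2)  # masses have moved toward each other
```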

  • by afidel ( 530433 ) on Monday March 08, 2010 @03:56PM (#31404370)
    In my experience, if you try to push PhysX that hard on a mid-level GPU or lower, it drops the framerate to unplayable levels. In fact, I disabled GPU-based PhysX and had the driver fall back to CPU processing, because in most games I have spare CPU cores while the GPU is already being pushed to the max. With a quad-core CPU costing almost nothing extra over a dual-core (on desktops at least), and decent GPUs starting at $100 and quickly going up from there, I think that's probably the norm for the vast majority of gamers.
  • by hughJ ( 1343331 ) on Monday March 08, 2010 @04:09PM (#31404566)
    Using a game like Arkham Asylum as a comparison between PhysX and Havok is silly. You've got one physics path that's been specially developed to showcase the strengths of a particular brand of hardware, while the other is left as a compatibility fallback for everything else. That's not an apples-to-apples comparison. By the same reasoning, one could say that ATI has worse anti-aliasing than Nvidia simply because the game shipped without anti-aliasing support for ATI.
  • by Xrikcus ( 207545 ) on Monday March 08, 2010 @05:20PM (#31405492)

    Unfortunately, from the physics simulations I've been working with, I'm pretty sure that what they've done there is simply remove physics activity from the non-accelerated version rather than add it to the accelerated one - a sneaky way of making GPU-accelerated PhysX look better. I'd be shocked if those book effects couldn't run just as easily at the same framerate on the CPU, unless there are truly ridiculous numbers of books.

  • by icebraining ( 1313345 ) on Monday March 08, 2010 @06:19PM (#31406484) Homepage

    Yes. How will they know what adverts to show? By tracking web usage, surveys, game usage metrics and other info they get by spying on you.
    Also it'll probably add delays while the ads are downloaded.

    They won't track anything more than game usage, because the game can't access anything else on my PC. And they can *already* be tracking game usage. Of course, if I know a game does, I won't buy it. And if it's an SP game, I'll just block it at the firewall level.

    As for delays in game loading, that's the kind of stuff that I can easily find out after reading a couple of reviews. Buying a game without doing so first is stupid anyway.

    As I said, if it's unobtrusive and doesn't shut the game down when it can't download the ads, fine by me.

  • by Roger Wilcox ( 776904 ) on Monday March 08, 2010 @07:12PM (#31407312)
    I have to disable PhysX in the nVidia control panel to get HL2 or any of the Source engine games to run properly! I had no idea what was causing these games to crash. After disabling PhysX they work right every time!

    Apparently it doesn't do anything crucial or even noticeable, as my games run just fine with it turned off. And now I'm told the game devs don't even want to use it?

    This "feature" has caused me nothing but grief!
