Game Devs Only Use PhysX For the Money, Says AMD 225
arcticstoat writes "AMD has just aimed a shot at Nvidia's PhysX technology, saying that most game developers only implement GPU-accelerated PhysX for the money. AMD's Richard Huddy explained that 'Nvidia creates a marketing deal with a title, and then as part of that marketing deal, they have the right to go in and implement PhysX in the game.' However, he adds that 'the problem with that is obviously that the game developer doesn't actually want it. They're not doing it because they want it; they're doing it because they're paid to do it. So we have a rather artificial situation at the moment where you see PhysX in games, but it isn't because the game developer wants it in there.' AMD is pushing open standards such as OpenCL and DirectCompute as alternatives to PhysX, as these APIs can run on both AMD and Nvidia GPUs. AMD also announced today that it will be giving away free versions of Pixelux's DMM2 physics engine, which now includes Bullet Physics, to some game developers."
They wish they'd thought of it first (Score:4, Informative)
Sounds to me like AMD just wishes they'd thought of it first. There's no reason AMD couldn't offer similar deals.
Re: (Score:3, Insightful)
Especially if it causes games to be less enjoyable on other hardware platforms. I could see a real problem with this in terms of anti-trust
Re: (Score:3, Interesting)
Look at the alternative: instead of adding useless physics to a game that doesn't need it, they could be adding advertisements. Advertising dollars are dollars nonetheless, and I would much rather have a quick "powered by PhysX" splash screen and some mindless physics interactions than an in-game billboard (possibly even updated over the Internet, shudder).
Re: (Score:2)
Games like GTA (at least San Andreas) already have billboards, would it be so bad if they had real companies instead of fakes? I wouldn't care.
In fact, I modded some of those to look like real ones, using The Sopranos wallpapers :)
Re: (Score:3, Interesting)
They won't track anything more than game usage, because on my PC the game can't access anything else. And they could *already* be tracking game usage. Of course, if I know a game is doing that, I won't buy it. And if it's an SP game, I'll just block it at the firewall level.
As for delays in game loading, that's the kind of stuff
Re: (Score:2)
I very much prefer a quick splash screen of "powered by PhysX" and some mindless physics interactions than an in-game billboard
How about an in-game banner that realistically flaps in the wind with the power of PhysX?
Re:They wish they'd thought of it first (Score:4, Insightful)
The payment could just mitigate the risk associated with bearing the extra cost of adding PhysX to a game when not all of the market can utilize it and there is limited experience with it in the developer community. That doesn't mean it's bad for the industry, or bad for the quality of the game.
Really? Can you point to any provision of anti-trust law that this would violate?
Re: (Score:2)
Really? Can you point to any provision of anti-trust law that this would violate?
Exactly - that's like saying that there's an anti-trust suit just because Modern Warfare 2 looks better on PS3 than on the 360 or because it looks better on a system running a 295 GTX than on a system running a 9800 GT.
Re: (Score:2)
First, no one is forcing companies to do anything. Most companies DON'T use PhysX. Secondly, PhysX is only used in GAMES, hardly a "necessity" (and yes, I'm a gamer). Thirdly, if you think this is "of questionable legality", then I'm assuming that you think Apple should be sued to hell and back for locking the iPhone / iPod to iTunes, right? Or the fact that I can't use my AMD processor with an Intel chipset, that should be grounds for suing too, right?
You're also forgetting that no game that uses PhysX
It's all Hearsay (Score:5, Insightful)
We don't have any proof that developers don't want PhysX. What we have is a spokesperson from company A saying that no one wants company B's technology. There are no scientifically obtained statistics, only one guy's opinion - and he works for a competitor.
Nor did the article state *why* it may be unwanted, or give any specific reasons not to use PhysX
Re: (Score:2)
It won't be. For one, NVIDIA is not a monopoly, and neither are the game publishers, so there are alternatives in both graphics hardware and in games. For two, "they are not allowing the creative process to organically produce the game through voluntary deals" is not the same as "they are causing the games to be worse". You would first need a reason why their of
Re: (Score:2)
But they shouldn't have to, that's the thing. Microsoft and Intel have already gotten in trouble for offering similar deals to OEMs to favor their products (the latter also to the detriment of AMD), so it's not inconceivable that NVidia could be hit with a similar problem as well, as they also hold a significant share of the market in question and this could very well be interpreted as a move designed to keep hold of it by unfair means.
Re: (Score:2, Interesting)
Did you miss the part where OpenCL and DirectCompute run on both NVIDIA and AMD graphics cards, and that AMD is promoting an open industry standard instead of a proprietary vendor-specific API?
Because I know it was really obvious, and sort of the entire point of the article, but it really sounds like you did.
Re:They wish they'd thought of it first (Score:4, Informative)
Suggesting that OpenCL and DirectCompute are alternatives to PhysX is analogous to saying that OpenGL is an alternative to Unreal Engine.
The basic reality here is that four years ago NVIDIA decided to invest a lot of money in making GPUs more general purpose, to apply them to more problems than just 3D rendering. ATI didn't care and just focused on making the fastest 3D card possible. Today there are alternatives to NVIDIA's technology, most notably OpenCL... but it's worth remembering that OpenCL is very strongly derived from CUDA. In fact, most of the OpenCL spec looks like they ripped it out of the CUDA spec and changed the function calls from cudaSomething to clSomething.
So yes, open standards are good. But in this case it does smell strongly of sour grapes. ATI made several bad business decisions and have been left playing catchup.
Re: (Score:2)
Actually, cuSomething to clSomething. OpenCL is basically a copy of the CUDA driver API, which prefixes its functions with cu. The runtime API (a higher-level API) is what uses the cuda prefix.
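As a purely illustrative aside, the naming parallel is easy to see in a minimal setup sequence through each API. The sketch below only uses entry points that exist in the CUDA driver API and in OpenCL, but it is a toy: error checking is omitted and you would need both SDKs installed to build it.

```cpp
// Toy comparison of the two APIs' setup calls: cu... (CUDA driver API)
// versus cl... (OpenCL). Error checking omitted for brevity.
#include <cuda.h>   // CUDA driver API
#include <CL/cl.h>  // OpenCL

static void init_with_cuda_driver_api() {
    CUdevice dev;
    CUcontext ctx;
    cuInit(0);                  // initialise the driver
    cuDeviceGet(&dev, 0);       // grab the first GPU
    cuCtxCreate(&ctx, 0, dev);  // create a context on it
    cuCtxDestroy(ctx);
}

static void init_with_opencl() {
    cl_platform_id plat;
    cl_device_id dev;
    cl_int err;
    clGetPlatformIDs(1, &plat, nullptr);                         // grab a platform
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);  // grab the first GPU
    cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, &err);
    clReleaseContext(ctx);
}

int main() {
    init_with_cuda_driver_api();
    init_with_opencl();
    return 0;
}
```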
Re: (Score:2)
Mod parent up! Why did the summary compare two APIs (OpenCL and DirectCompute) to a physics engine? Also, wasn't there an article a little while back that announced Nvidia is porting PhysX to OpenCL?
And why is AMD promoting DirectCompute as an open standard? DirectCompute is Microsoft's GPGPU API (like CUDA and OpenCL) but is part of DirectX, which does not run on Mac or Linux, etc. AMD should promote OpenCL over DirectCompute for game and application use.
Re: (Score:2)
There's nothing PhysX actually adds to the experience of a game that Havok (physics done on the CPU) doesn't. Sure, PhysX can handle more intense simulations, but I don't see how it could improve, say, Half-Life 2's implementation of Havok.
Re:They wish they'd thought of it first (Score:4, Interesting)
It's a difference in scale over Havok. I haven't had much time to play video games lately, but I saw a particularly nifty shot from Arkham Asylum. Shoot a bookshelf without PhysX and it falls over. Shoot it with PhysX and suddenly every individual page from every book flies through the air, each tracing its own path down from the sky.
So, you can do physics in Havok. But not on that scale.
I suspect the reason it's not being used for anything other than "ground clutter" is that you can't design your game around PhysX - not everyone has an NVIDIA card. So, PhysX has to be optional and can't change gameplay - which pretty much relegates it to ground clutter.
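That "optional eye candy only" constraint boils down to a capability check around cosmetic effects. A minimal sketch of the pattern, with entirely hypothetical function names (this is not any real engine's API):

```cpp
#include <cstddef>
#include <iostream>

// Hypothetical capability query: in a real engine this would ask the physics
// runtime whether a GPU (or PPU) accelerated backend is active.
bool HardwarePhysicsAvailable() { return false; }

// Hypothetical cosmetic-only effect.
void SpawnDebris(std::size_t pieceCount) {
    std::cout << "spawning " << pieceCount << " debris pieces\n";
}

void OnBookshelfShot() {
    // The gameplay-relevant outcome (the shelf falls over) is the same for
    // everyone; only the cosmetic particle budget scales with hardware support.
    const std::size_t pieces = HardwarePhysicsAvailable() ? 5000 : 50;
    SpawnDebris(pieces);
}

int main() {
    OnBookshelfShot();
    return 0;
}
```

Because the gameplay outcome never depends on the check, accelerated physics ends up confined to debris, cloth and other set dressing.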
Re: (Score:2)
In fact I disabled GPU-based PhysX and had the driver fall back to the CPU, because in most games I have spare cores but the GPU is already being pushed to the max.
PhysX doesn't use multiple cores very effectively, from what I understand. This isn't surprising - Nvidia wants you to use one of their GPUs, not CPU power.
Re: (Score:2)
"So, you can do physics in Havok. But not on that scale."
We didn't need physx acceleration to do that - we demoed that kind of fun stuff in pure software using 32-bit system vs 64-bit system several years ago - Far Cry, anybody?
Re: (Score:2)
Very true - I was thinking of Valve's "Source" engine when I read "Havok." My brain had too little coffee for real-time posting simulations.
Re:They wish they'd thought of it first (Score:4, Interesting)
Unfortunately from the physics simulations I've been working with I'm pretty sure that what they've done there is simply removed physics activity from the non-accelerated version rather than adding it to the accelerated one. A sneaky way of making GPU-accelerated PhysX look better. I'd be shocked if those book effects wouldn't be just as easy at the same framerate on the CPU unless there are truly ridiculous numbers of books.
Re: (Score:2)
Spoken like someone who has never had to
a) animate + draw 2D sprites doing:
- neutral pose
- run pose
- neutral with weapon
- neutral with shield
- running with weapon
- running with shield
for ALL THE FRAMES. I haven't even mentioned the other million permutations of the avatar+enemies in states such as poisoned, etc.,
OR
b) programmed a graphics engine that has had to light said 2d sprites.
3D "won" because of it scaled up content creation. i.e. The convenience of animating, texturing, lighting and shading blows
Maybe (Score:4, Interesting)
I wouldn't be surprised if most game devs wouldn't implement PhysX if not for a subsidy. Only half the market is going to be able to take advantage of it after all. It may not be that they don't want it, just that it's not an economical use of their time otherwise.
Re: (Score:3, Interesting)
Flying boxes and wood splinters do not make a better game.
Well - it's the little things that make the difference though. I mean, you wouldn't think that flying boxes and wood splinters make a game any more amazing, but those were basically THE core elements of The Force Unleashed, using the Havok engine. Not surprisingly though, Havok was strictly licensed to Lucasarts for all of 2009 - no one else could use it. It's only just recently become available. So - for most of 2009, PhysX was the best choice - not only subsidized for using it, but because its comp
Re: (Score:2)
I thought Havok has been used by SecondLife for years....
Re:Maybe (Score:4, Funny)
Re: (Score:2)
I wish. All you can get right now is a flat texture. SL could really use a decent fur shader. And better support for quadrupeds and other types of avatars. And flexible tails that can move properly. That's just for a start.
SL still has a long way to go in that respect.
I think you are confuzled (Score:5, Informative)
Intel owns Havok (since 2007) and licenses it out all over the place. There's a page that lists all the titles using it (http://www.havok.com/index.php?page=available-games) and it is not a small list. Havok also runs exclusively on the CPU (and will probably continue that way, since Intel wants to sell quad cores), so it works no matter what graphics card you have.
It's also not just physics anymore, there's Havok animation libraries and so on.
Re: (Score:2)
I was mistaken, it was the DMM from Pixelux that was licensed - which AMD is also giving out, according to the article.
Re: (Score:2)
It's only just recently become available.
I haven't been able to confirm your statement about Havok being exclusive to Lucasarts for 2009, but the Havok engine has been around since 2000 and has been used by over 100 different games, so it is by no means just recently available.
Re: (Score:2)
Appears I was mistaken; the DMM by Pixelux and Euphoria (for AI) were the ones strictly licensed to LucasArts. Havok is and was always available. My bad.
little things make a difference only if (Score:2)
they are implemented well. So far, what I have seen in all those 'physics' engine games (regardless of whose engine it is) has been geometrically constructed splinters and pieces flying around. It takes more away from realism than it delivers, because the visuals they create (i.e. the distribution and nature of the destruction) are generally unrealistic.
Re:Maybe (Score:5, Insightful)
"Flying boxes and wood splinters do not make a better game."
But dead guys lying stiff as a board, sticking straight out over a cliff edge, makes them awesome? Does no one here remember the good old days of early FPS where if you died on the edge of a ledge your body would lie flat over the edge? Does no one remember the time when you hit dead bodies with shots and they didn't move or flail around? What about Mass Effect 1, with the anti-gravity at the end and the geth/dead bodies floating and flailing around - not cool at all?
All that is physics, and yes, they do make a better game WHEN they are applied to things that need them and not over-used, especially not using physics as a gimmick.
Re: (Score:2)
What about mass effect 1 the anti-gravity at the end with the geth/dead bodies floating and flailing around, not cool at all?
I thought that was a feature. Gravity on the Presidium was 0.3G, and you're fighting on the outside of the station, so nailing a geth or krogan with high-powered biotics might be enough to accelerate it to escape velocity.
Re:Maybe (Score:5, Funny)
Does no one remember the time when you hit dead bodies with shots and they didn't move or flail around?
Not everyone includes "pretty" in their "good game" equation. Doom can still hold its own against modern games in terms of actual fun.
Clearly you don't get a kick out of shooting dead bodies and seeing them twitch.
What the hell's wrong with you?
Re: (Score:2)
Ya know, when you've seen it happen, it looks so fake in a game, no matter how good the engine...
Re: (Score:2)
"Not everyone includes "pretty" in their "good game" equation"
No doubt, but most people move on if a game has better graphics; otherwise you would still be playing Wolfenstein 3D. Why did you move to Doom? Oh yes, that's right - that darn graphics thing adds atmosphere and awesomeness to games. Why did people move on from Super Mario Bros. for the NES, etc., etc.? Not all games that come after are as fun, but if you can get the same old fun in a shiny new package, people will play it over the original.
Ever played the o
Re: (Score:2)
You are assuming it was because of the improved graphics, and not because of the improved gameplay mechanics and pacing. Polished turds are still turds and great games will always be great.
Re: (Score:2)
"You are assuming it was because of the improved graphics, and not because of the improved gameplay mechanics and pacing. Polished turds are still turds and great games will always be great."
They will, but sadly great gamers like ourselves are in the minority. Assassin's Creed 1 was like a poor man's Prince of Persia, but it sold millions. Sadly lots of people have junk gaming tastes, which means many aren't discerning enough to tell the difference between a mediocre game that missed the mark and a good gam
Re: (Score:2)
yes, what fools everyone else must be - they have no clue what they like at all. Don't they know they're supposed to ask you first, and then only like the things that you like?
Re: (Score:2)
"yes, what fools everyone is, they have no clue what they like at all. don't they know they're supposed to ask you first, and then only like the things that you like?"
You're exactly what I'm talking about - first of all, AC1 was a third-person acrobatic combat game, i.e. cookie-cutter and generic, nothing new except new art - that's what makes people buy the game: "omg the art is so awesome and the AC1 assassin looks so badass, got to buy this!". You could climb walls and jump on rooftops - you could do simila
Re: (Score:2)
Newer games offer more colors and higher resolution, nothing wrong with that. I'm sure the latest God of War would be just as fun in 8-bit 320x200.
Re: (Score:2)
You missed my point completely. Go play Mass Effect 1 with the biotic powers; if you do not find the floaty physics of the "anti-gravity" animations of bodies floating around awesome, then there is something wrong with you!
Physics is used as a special effect in games. Sure, games have clipping errors - most do. But these are HARD problems - i.e. they take time and money to develop and would not make for better games; games have limited budgets and they have to decide where to spend that money, people keep forg
Re:Maybe (Score:5, Informative)
Re: (Score:2)
Is PhysX just an API or is there hardware underneath supporting it? If it's hardware, then I'd say PhysX would be the better option practically. i.e. turning PhysX on would essentially be free in terms of resource usage. If it's just software then turning it on would take resources away from the rest of the rendering.
In any case, I've played a few games with PhysX. It's pretty fucking cool. Not cool enough to make a shitty game worth playing, but it makes a good game that much better.
Re: (Score:2)
Everyone seems to be glossing over a nice little fact:
Physx works on -all- modern Windows computers, whether they have a graphics accelerator or not. So yes, only half the market can use the hardware-accelerated Physx, but the other half isn't barred from the game. They get to play, too.
Re:Maybe (Score:5, Funny)
Open standards always win out over closed standards. Like OpenGL -vs- DirectX.... oh... wait... :-P
Re:Maybe (Score:5, Informative)
I've done some work with both PhysX and the things that AMD is pushing for. I try to keep with the Physics Abstraction Layer [adrianboeing.com], which lets me plug in whatever physics engine as the backend, which gives a pretty damn good apples-to-apples performance metric. Personally, my ultimate choice of physics engine is the one which exhibits the best performance. My experience may differ from others, but I generally get the best performance from PhysX with an nVidia GPU and BulletPhysics with an AMD GPU. Sometimes, the software version of PhysX outstrips the competition, but I have never seen anything beat PhysX in performance with GPU acceleration turned on. And with PAL, it is easy to check if there is GPU support on the machine and swap in the physics engine with the best performance (PAL is awesome).
Here's the thing: GPU-accelerated physics are just plain faster. Why? Because collision detection is a highly parallelizable problem. Guess what hardware we have that can help? The GPU. Another great part of using the GPU is that it frees the CPU to do more random crap (like AI or parsing the horribly slow scripting language).
AMD is working on both BulletPhysics and Havok so they can do GPU acceleration. But I have a feeling that PhysX performance will remain faster for a while: PhysX was designed to natively run on the GPU (technically, a GPU-like device), while these other libraries are not. Furthermore, nVidia has quite a head start in performance tuning, optimization and simple experience. In five years, that shouldn't matter, but I'm just saying that it will take a while.
So here is my message to AMD: If you want people to use your stuff, make something that works and let me test it out in my applications. You've released a demo of Havok with GPU acceleration. PhysX has been and continues to work with GPU acceleration on nVidia GPUs and will frequently outperform the software implementation. I'm all for open alternatives, but in this case, the open alternatives aren't good enough.
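For what it's worth, the backend-swapping idea reads roughly like the sketch below. This is not the actual PAL API - every type and function name here is a hypothetical stand-in - it just shows the pattern of hiding the engine behind one interface and picking the fastest backend the machine supports.

```cpp
// Illustrative only: hypothetical types standing in for an abstraction layer
// that selects a physics backend based on what hardware is present.
#include <iostream>
#include <memory>
#include <string>

struct PhysicsBackend {                    // hypothetical common interface
    virtual ~PhysicsBackend() = default;
    virtual std::string Name() const = 0;
    virtual void Step(float dt) = 0;
};

struct GpuPhysXBackend : PhysicsBackend {  // stand-in for GPU-accelerated PhysX
    std::string Name() const override { return "PhysX (GPU)"; }
    void Step(float) override { /* dispatch the simulation to the GPU here */ }
};

struct BulletBackend : PhysicsBackend {    // stand-in for Bullet on the CPU
    std::string Name() const override { return "Bullet (CPU)"; }
    void Step(float) override { /* integrate rigid bodies on the CPU here */ }
};

bool CudaCapableGpuPresent() { return false; }  // hypothetical capability probe

std::unique_ptr<PhysicsBackend> PickFastestBackend() {
    if (CudaCapableGpuPresent())
        return std::make_unique<GpuPhysXBackend>();
    return std::make_unique<BulletBackend>();
}

int main() {
    auto physics = PickFastestBackend();
    std::cout << "using " << physics->Name() << "\n";
    physics->Step(1.0f / 60.0f);
    return 0;
}
```

The game code only ever talks to the interface, which is what makes an apples-to-apples performance comparison between engines practical.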
Re: (Score:3, Informative)
The problem is, Nvidia bought the company that created the PhysX chip. They then incorporated the capabilities into their GPUs and their drivers, while refusing to allow a true PhysX card to work with anything except an Nvidia GPU in the system. In this case, Nvidia has done a major disservice to everyone, as the original PhysX did run on anything that supported it, but now it's a closed-source, Nvidia-only app/feature.
Re: (Score:2)
Well most devs would want to use their version because of pride in their work.
Re: (Score:3, Informative)
Only half the market is going to be able to take advantage of it after all.
1) PhysX runs on the CPU if no nVidia GPU is present. A $100 quad-core CPU easily handles it for most games.
2) According to the Steam Survey, nVidia is approximately 66% of the PC gaming market. Two thirds.
It's a new riff on the old joke (Score:2)
"You're so ugly the only way to get the dog to play with you is to tie a steak around your neck."
Says the kid the dog without a dog to play with.
Re: (Score:3, Informative)
Wow, that's so badly edited it's surreal.
This is one of those days where even the "Preview" button doesn't help.
That should read "Says the kid that the dog isn't playing with."
Re:It's a new riff on the old joke (Score:4, Funny)
Yes, that's something up with which I cannot put.
Re: (Score:2)
If you use Google Translate to translate it back into Korean, then Portuguese, Russian, Welsh and then finally back into English everything becomes much clearer.
"Children play with a puppy."
Re: (Score:2)
For some reason, I got this picture:
http://craphound.com/images/translateservererror.jpg [craphound.com]
Is it actually allowed to also BE better? (Score:3, Interesting)
Is it true? (Score:2)
It seems a lot of people are kvetching at AMD for this because they're criticising a competitor. I think it's really more relevant to consider if what AMD says is true - if nVidia is paying people to use their proprietary stuff and then claiming it has broad industry adoption (and therefore is good), that's pretty shady.
I'm not sure how we really can tell if the criticism is valid unless we're in the industry though.
clutching at straws (Score:5, Insightful)
GPU makers are in a bind:
- IGP are now enough for 90% of users: office work (even w/ Aero), video, light gaming, dual-screen... all work fine with IGPs
- the remaining 10% (gamers, graphic artists) are dwindling for lack of outstanding games: game publishers are turned off by rampant piracy, and nowadays it's mainly online games that bring in big money
- GPGPU is useless except in scientific computing: we already have more x86 cores than the devs know how to use, let alone use a different computing paradigm
- devs have to target the lowest common denominator, which means no GPGPU for games
I'm actually thinking of moving my home PC to one of the upcoming ARM-based smarttops. They look good enough for torrenting + video watching + web browsing, consume 10 watts instead of 150...
Re:clutching at straws (Score:5, Informative)
IGP are sufficient for 90% of users... but that hasn't changed since back in the Pentium 1 days. Many PCs were equipped with IGP or something that amounted to the same thing but in card form even then.
Also: GPGPU is NOT meant for gfx processing on the fly at all, so it has absolutely nothing to do with devs having to target the lowest common denominator. You even state that it's useless except for scientific purposes in your own comment. The entire purpose of the GPGPU move is towards scientific purposes where vast quantities of repeated calcs have to be done. Something that GPUs excel at.
At least get SOME of your facts straight before spouting FUD.
Re:clutching at straws (Score:4, Interesting)
There are about 25 million PCs sold per month. I guess ATI is happy to have sold 8% of that monthly amount over the several months their 5xxx have been available, that's 3-4% of PC sales. Congrats to them, but still, fairly marginal.
Discrete cards have always been better than IGPs. I don't really get your point. Only recently (definitely way after the pentium 1) have IGPs become good enough to display all video files, or handle Aero.
PhysX is about making physics computations, not directly putting pixels on screen, so it's a kind of specialized GPGPU.
Re: (Score:2)
I've got studies that say IGPs actually rose vs GPUs in 2009, I'm interested in your sources.
Indeed, the difference is between IGPs playing some videos, which they always did, and IGPs playing all videos, which they now do. Glad you got that.
Mmm, are you saying that IGPs, especially Intel's dominant ones, have done hardware-assisted video decode for all formats for a long time? That's just not true.
And finally, my point exactly: if devs can't do multi-core even now, when do you think they will do phys
Some truth to that. (Score:2)
While I don't think it's super dire, it's certainly a concern. I can add another point. Steam confirmed for Mac [nyud.net].
Problem? Macs don't take the latest and greatest off-the-shelf graphics cards, and generally are a fair bit behind the curve, way back in 'casual land'.
On the other hand, maybe if Apple opens up a bit, this is a way to sell more and better cards rather than another nail in the coffin.
re: Steam for Mac (Score:2)
Yeah... this is more of a solution than a problem, any way you slice it. Why? Simple ... Many of the games they'll deliver to Mac users via Steam will offer cross-platform network play. So regardless of the specs they're constrained to for a native Mac version of the game, it will help keep a title popular having more people playing it. They can always support higher-res graphics capabilities in the Windows version, if they so desire. And if they do? All the more incentive for Apple to start releasin
Re: (Score:2)
"GPGPU is useless except in scientific computing: we already have more x86 cores than the devs know how to use, let alone use a different computing paradigm"
Well, maybe for games, but GPGPU will mean a lot for transcoding.
Home HD video is going to be big soon, and it takes forever to transcode. You can even do that on ARM, though: the Tegra 2 and the OMAP line have enough GPU power to use for transcoding.
Re: (Score:3, Insightful)
I wonder why you attribute the lack of outstanding games to piracy being rampant -- the industry has been bitching and moaning about that for over 20 years now. That can't be the reason, or we would not have a videogame industry at all.
Few game developers are willing to do risky things though, and countless remakes of the same games just don't really appeal to all that many gamers -- add to that that gaming itself is being transformed (or rather, the marketplace is changing with mobile games becoming a pasti
Is 'Incentivizing' Anti-Competitive? (Score:3, Interesting)
This kind of incentive is anti-competitive.
1. It eliminates competition by feature/functionality.
2. It meaningfully constrains innovation. A novel product without the capital to participate in such deals is shut out. (That's the goal anyway)
That said, this kind of incentivizing is everywhere. (game consoles, mega-retailers, mobile phones) No one seems to care about the increased costs consumers assume or the constraint on innovation.
I have my bias, what is yours?
Re:Is 'Incentivizing' Anti-Competitive? (Score:4, Interesting)
Nvidia is very anti-competitive and has been for a very long time.
The recent "making physx stop working when AMD gfx card is present" is just one of the more public outings of their unethical behavior.
I wish someone would expose all of their shenanigans and anti-competitive practices so people can realize how badly these things affect the industry and consumers (ugh, hate that word).
The most recent thing I read about their practices is from the upcoming PC game, Just Cause 2. There's a trailer showing off Nvidia-only effects ...(something which is dead standard DirectX code) and artificially blocking out AMD/others from getting the benefits. The Batman Arkham Asylum scandal was one more people may recall. They (and their users/shills) claim that TWIMTBP is just "marketing"... more like bribery and blocking out the competition. They've been caught on many occasions but the public rarely sees anything negative about them.
Nvidia is the Intel/Microsoft of the video card industry but unlike them, isn't quite as dominant (thankfully for us) but they still do a hell of a lot of damage. (The Jupiter of the computer industry... too small to become a sun but still an 800 quadrillion ton gorilla).
I stopped buying Nvidia cards after the GeForce 2 - at that time for performance reasons, but since then I vote with my wallet and let others know to support fair and legal competition.
Best example with the MMORPG UTOPIA (Score:5, Interesting)
A friend told me about his experience with Utopia. It implemented GPU-accelerated physics in one of its recent patches. But try as he might, he failed to notice any difference over weeks of gameplay. Until he entered the central city. With flags by the entrance fluttering smoothly in the wind, instead of the old static animation.
Yep, that's it. Many megabytes of a patch, a game of hundreds of miles of terrain, hundreds of locations, battles, vehicles, all that stuff... and physics acceleration is used to flutter flags by the entrance.
My complaint (Score:2)
The original intention of Ageia and their PhysX setup seemed to be just to sell the company, rather than to build a viable business selling hardware. Ageia would have been more open with the API and code right from the start if they had intended to make a business of selling hardware.
Wha? (Score:2)
"They're not doing it because they want it; they're doing it because they're paid to do it."
Doesn't this describe just about any paid project? Just sayin'
Optional Features are expensive (Score:2)
The physics business (Score:2)
Ageia's innovation wasn't their technology. It was their business model. Havok gets a fixed fee per title. Ageia's "physics chip" got revenue for each card sold. Both Havok and Mathengine had serious revenue problems as standalone companies. The original investors did not do well. Both were eventually acquired. The basic problem is that game middleware isn't a good business.
Physics in the GPU is mostly useful for visual effects like water, snow, fire, explosions, etc., where the motion doesn't f
Duhhh!!! (Score:2)
They're not doing it because they want it; they're doing it because they're paid to do it.
I can say the same thing of just about everyone who is employed, even the folks at AMD. Though it's only in the "creative" arts that there's always this odd shiny coating of "fidelity" that seems to be desired and added on as a last step. In reality, this coating is as faux as the images and sounds that these arts provide. The bottom line - it's a business; any art is just an afterthought. If they can make more
Open standards, and nothing else (Score:4, Insightful)
I wouldn't even care if PhysX was the biggest software innovation of the century - in gaming, especially in regard to graphics, we have suffered a lot because of proprietary shit in the last two decades. I don't want to see that again. Even if they're coarse and inadequate at the start, everyone should push for open standards so that we won't get in deep trouble later.
For the money? (Score:2)
In my experience, PhysX has only been a hindrance! (Score:5, Interesting)
Apparently it doesn't do anything crucial or even noticeable, as my games run just fine with it turned off. And now I'm told the game devs don't even want to use it?
This "feature" has caused me nothing but grief!
Re:What does PhysX do anyways? (Score:4, Insightful)
duh, it's got what gamers crave!
Re: (Score:2)
Girlfriends?
Re: (Score:2)
White Castle [whitecastle.com]
Re: (Score:3, Informative)
Ask, and ye shall receive: http://en.wikipedia.org/wiki/PhysX#PPU [wikipedia.org]
Re: (Score:2)
The benefit: Physics is one of those easily parallelised problems so a very large increase in complexity is possible.
The drawbacks: There's less GPU time available for drawing stuff so your framerate suffers. And of course, it's limited to Nvidia hardware only.
The latter leads to a drawback of its own: the technology can't be used to its full potential because many people who buy a game won't have the necessary hardware. So it can't be used in
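To illustrate why a physics step parallelises so cleanly: each particle (or each pair test in broad-phase collision) is independent work, so it splits evenly across CPU threads, and across thousands of lightweight threads in the GPU-accelerated case. The sketch below is illustrative only - made-up particle counts, no collision handling.

```cpp
#include <cstddef>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Integrate one contiguous chunk of particles; chunks never overlap,
// so no synchronisation between threads is needed.
void Integrate(std::vector<Particle>& ps, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        ps[i].vy -= 9.81f * dt;  // gravity
        ps[i].x  += ps[i].vx * dt;
        ps[i].y  += ps[i].vy * dt;
        ps[i].z  += ps[i].vz * dt;
    }
}

int main() {
    std::vector<Particle> particles(100000, Particle{0.f, 10.f, 0.f, 1.f, 0.f, 0.f});
    const float dt = 1.0f / 60.0f;
    const std::size_t nThreads = 4;
    const std::size_t chunk = particles.size() / nThreads;

    std::vector<std::thread> workers;
    for (std::size_t t = 0; t < nThreads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = (t + 1 == nThreads) ? particles.size() : begin + chunk;
        workers.emplace_back([&particles, begin, end, dt] { Integrate(particles, begin, end, dt); });
    }
    for (auto& w : workers) w.join();
    return 0;
}
```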
Re: (Score:2)
I think it makes you look like a dipshit every time you provide a link.
You do not provide any sort of answer, and you assume that the person did not already google it.
So let's assume that the person did google this, and did not find an answer that helped him understand the issue. Wouldn't a good next step be to ask others for help? Is Slashdot not a good place to ask the question?
Re: (Score:2)
Excellent reply.
Slashdot brings in geeks from all realms. Some may be smart in one area, but clueless in others with no real knowledge of a subject. Even to the point of "Hey, anyone mind helping me understand this issue?".
I should have re-read the original post in this thread. The original poster was an idiot, but even so, telling him to google it would do nothing, especially in this case since he has already used PhysX in games and was mainly stating that he saw no point.
My reply was mainly a knee-jerk a
Re: (Score:2)
It would be coo
Re: (Score:2)
Actually this sounds fairly familiar, there are strong parallels between this and AMD's issue with Intel.
Nvidia is using their market share to push forward software that can only run on their cards, by paying companies to use it. If the developers are using Nvidia's solution, then they are not using the competitors'.
Re: (Score:2)
If Microsoft made computers you would have a point.
I guess your point may have been that Microsoft only made software that works with their operating system? If so, that would also have been slightly incorrect in that Microsoft does put out a version of Office for Macs. They don't do much beyond that for other operating systems so you would have been close.
Not yet (Score:2)
Though the reason for that at this point is the newness of the APIs, not because they can't be used. We'll have to wait a couple years to see if one or both of the technologies take off. Please remember that the OpenCL API didn't get finalized until the end of 2008, and GPUs didn't implement it until several months later. So there has been less than a year that one could develop on real hardware using it. DirectCompute was released with DirectX 11, October of 2009. Also it requires DirectX 10, 10.1, or 11 a