Intel Purchases Havok

Dr. Eggman writes "Gamasutra has the recent announcement; Intel has purchased Havok. 'As the firm noted, Havok 5 features enhancements to its core products, Havok Physics and Havok Animation, and introduces new features for Havok Behavior, a system for developing event-driven character behaviors in a game. Some of the games using Havok technology, particularly its Havok Physics solution, include BioShock, Stranglehold, Halo 2, Half Life 2, Oblivion, Crackdown, and MotorStorm - the company is also rapidly developing and marketing further tool products.' No word on what (if anything) Intel plans to do with its new acquisition."
  • by $RANDOMLUSER ( 804576 ) on Sunday September 16, 2007 @10:26PM (#20631759)
    Intel's gonna do what Intel always does - they're gonna turn that stuff into silicon. Expect a physics engine chip from Intel.
    • by SuluSulu ( 1039126 ) on Sunday September 16, 2007 @10:36PM (#20631823)

      Intel's gonna do what Intel always does - they're gonna turn that stuff into silicon. Expect a physics engine chip from Intel.
      In other words Intel is going to reek HAVOK on AGEIA.
      • Re: (Score:2, Funny)

        by Anonymous Coward

        Intel is going to reek HAVOK
        No Intel mostly just reeks. Perhaps you meant wreak HAVOK?

        Sincerely,

        your friendly AC spelling Nazi
        • Thanks, AC spelling Nazi. I'm glad there is someone standing up for us goose-stepping book-readers and our desire to rule the world through precise communication.

          • Godwin's law. Game over
          • by Pojut ( 1027544 )
            I could be wrong, but I believe your signature is the first Red Dwarf reference I have seen on Slashdot...
          • Speaking as a professional communication nazi, I have to say that it's our desire for precise communication that prevents us from ruling the world. First of all, being a power-mad fascist dictator requires you to be really good at ambiguity, so that everybody thinks you're on their side, and so that you don't have to take the blame for your own mistakes.

            Besides, people find all those corrections to be really obnoxious. That's why we can never raise the army of rabid followers you really need for world domin
    • by edwdig ( 47888 ) on Sunday September 16, 2007 @10:39PM (#20631843)
      Intel's gonna do what Intel always does - they're gonna turn that stuff into silicon. Expect a physics engine chip from Intel.

      Quite the opposite. Intel's going to work on making it scale well across multiple CPU cores so that gamers will want to buy quad core CPUs.

      Making you want to replace your CPU more often is much more attractive to Intel than starting a whole new completely unproven niche hardware line.
      • Actually... (Score:5, Insightful)

        by Svartalf ( 2997 ) on Sunday September 16, 2007 @10:59PM (#20632007) Homepage
        They're doing something along those lines in the GPU space. Technically, a GPU is little more than a stream processor - something you can do 3D graphics with, or DSP, or physics, etc. I still have to wonder what they were thinking when they snapped up Havok. They're predominantly in the silicon business, producing specialized libraries that highlight their chips but only occasionally get used, because while those libraries make Intel's chips look good, they don't do so hot with AMD CPUs. So, typically, people avoid their libs for anything production-grade like a game.
      • Yep (Score:5, Insightful)

        by Sycraft-fu ( 314770 ) on Monday September 17, 2007 @12:17AM (#20632461)
        PPUs are a dead-end thing and Intel is quite aware of that. They aren't a big enough improvement over what a good multi-core processor can do. They also suffer from the "chicken and egg" support problem. Even if the PPU were way above what a processor can do, why would a game bother supporting it? The problem is that you can't very well use the PPU for physics that affect the gameplay. That would mean you'd have to restrict the game to PPU owners only, who are too small a number to make that economical. So you have to restrict it to showy physics, things like more fragments in explosions and such. Fair enough, but most people won't buy a card for that. I mean, if you've got $300 to blow, what makes for better eye candy: a PPU that makes some physics-related things look a little better, or a high-end GPU that makes EVERYTHING look better?

        As such it is extremely hard to get it to go past the critical mass where enough people have them that you can start requiring them in games for core gameplay. Thus it makes sense to just start taking advantage of the increasing power in CPUs and use that instead.
        • Re:Yep (Score:5, Insightful)

          by Tim C ( 15259 ) on Monday September 17, 2007 @02:05AM (#20632997)

          They also suffer from the "chicken and egg" support problem.


          So did GPUs when they first came out, of course. What happened then (if you remember) is that games shipped with the option for software rendering or hardware-accelerated rendering for quite a long time. Some older games had patches released to enable hardware rendering (eg Quake, Tomb Raider). I still remember the first time I saw GLQuake running on my housemate's PC. Of course, he didn't have an accelerator, so while it was beautiful, the one frame every few seconds he got was totally unplayable...

          Anyway, that said I do tend to agree that physics accelerators simply aren't going to go anywhere any time soon, if ever. GPUs make a huge difference, but PPUs? I don't see it.
          • 1) The game played the same in software and hardware; it just looked better with hardware. Graphics can be scaled up and down a large amount without affecting gameplay. That's not as true of physics. There are things that you can turn on and off, but a good deal of it affects gameplay and thus can't be optional. You can't very well have a racing sim where cars handle one way for people with physics cards and another way for people without.

            2) Graphics accelerators made a MASSIVE noticeable improvement on
            • > was the difference between playing Quake in 640x480 with a slightly jerky frame rate, and sparkly pixelated textures and playing it dead smooth at 800x600 with nice,

              Actually it was Quake 1 on a Pentium Pro 200,

                VGA 320x240
                3Dfx 512x384 (or 640x480)

      • Re: (Score:2, Interesting)

        by UltraAyla ( 828879 )

        I think you're likely right. They can just use this acquisition to sell product.

      However, I wonder if they might do something along the lines of creating "Havok accelerated quad cores" (or whatever). Pull a Microsoft/Internet Explorer and bundle a bit of hardware Havok acceleration into all of their chips (and it really must be all, or close to all, of their high-end chips). This makes a large portion of the market adopt it by default (large enough to make companies code for optimizations just like with Nvidia

      • Re: (Score:3, Interesting)

        by Spy Hunter ( 317220 )
        I think they're going to make it run on Larrabee [wikipedia.org], their in-development x86-based graphics card to compete with GeForce and Radeon. It's hard to imagine a more perfect match, actually.
      • Re: (Score:3, Insightful)

        by Yvanhoe ( 564877 )

        Quite the opposite. Intel's going to work on making it scale well across multiple CPU cores so that gamers will want to buy quad core CPUs.
        And preferably in a way that makes it incompatible with AMD chips. Better yet: compatible but 10x slower.
      • Intel's gonna do what Intel always does - they're gonna turn that stuff into silicon.

        More importantly, they'll get that silicon into the hands of consumers. So we can expect real people to have this in their boxen. Hopefully someone from Xorg will start putting physics into their desktop.

        I hope Intel turns physics into the next MMX; even my dad was saying he needed MMX in his next PC. No idea what it was, but Intel marketing had him convinced he needed to upgrade, and upgrade NOW damnit.

      • by Ant P. ( 974313 )
        I'd much prefer they add the physics stuff as part of SSE6 or whatever, rather than as a proprietary add-in card.

        No drivers required is far better than no drivers available.
      • I couldn't have said this any better.
      • Makes sense. If physics takes off, and there are $200 Ageia cards while Intel ships high-end motherboards with onboard physics (like onboard sound), they'll dominate. It's the same thing AMD is trying to do with graphics (Fusion).

        If I can buy an integrated graphics solution from AMD or an integrated physics/graphics solution from Intel, it's an easy choice: Fusion will provide crap graphics, and graphics have a faster upgrade cycle anyway.

        There are some really interesting videos floating around about Intel in
    • by KDR_11k ( 778916 )
      Looking at what Intel did in the past they're more likely to modify Havok in a way that "accidentally" gives abysmal performance on AMD CPUs.
      • by Kelbear ( 870538 )
        That was my first impression. I don't cherish the thought of AMD sinking and losing the one other option on the market.
    • Shut 'em down, shut 'em, shut 'em down.
    • The Irish Times is reporting...

      "Intel agreed to buy 100 per cent of the animation software company Havok,
      the name that Telekinesys trades under, for about €79.2 million cash in a deal expected to close within five days."

      AFAIK
      Havok is an Irish company spun out of Trinity College Dublin. www.tcd.ie

    • by Nosklo ( 815041 )

      Intel's gonna do what Intel always does - they're gonna turn that stuff into silicon. Expect a physics engine chip from Intel.
      It would be more interesting if they just open-sourced the whole thing, but unfortunately this is very improbable.
  • Why...? (Score:2, Interesting)

    No word on what (if anything) Intel plans to do with its new acquisition.
    That's an easy one: Make it run artificially worse on AMD processors. (See also: Skype)
    • Re:Why...? (Score:5, Interesting)

      by pchan- ( 118053 ) on Sunday September 16, 2007 @10:50PM (#20631951) Journal

      That's an easy one: Make it run artificially worse on AMD processors. (See also: Skype)
      You mean they will kill the port to AMD's ATI-brand GPUs, which are moving into physics simulation and need the Havok engine that so many games run on.
    • Is that legal? I thought you could *optimize* something for your own hardware instructions, but can you legally make it worse for other people? And how will we ever know if this happens (there is no benchmark for "havok" engines, I think)?
      • Sure they can.... They are not a monopoly and as such standard rules apply. They might simply drop AMD compatibility completely.
      • Re:Why...? (Score:5, Informative)

        by Anonymous Coward on Monday September 17, 2007 @02:22AM (#20633095)
        They did it with their C compiler in the past. When they detected anything but an Intel processor, they didn't use the SIMD instructions, even when the processor indicated full support for SIMD *IN THE INTEL DOCUMENTED WAY*. You can Google for the details.
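
        For illustration, the kind of check being described amounts to dispatching on the CPUID vendor string instead of the actual feature bits. A minimal C++ sketch (assuming GCC/Clang's <cpuid.h>; the function name is made up for illustration):

          #include <cpuid.h>
          #include <cstdio>
          #include <cstring>

          // Read the 12-byte vendor string from CPUID leaf 0 (EBX, EDX, ECX order).
          static bool is_genuine_intel() {
              unsigned int eax, ebx, ecx, edx;
              if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
                  return false;
              char vendor[13];
              std::memcpy(vendor + 0, &ebx, 4);
              std::memcpy(vendor + 4, &edx, 4);
              std::memcpy(vendor + 8, &ecx, 4);
              vendor[12] = '\0';
              return std::strcmp(vendor, "GenuineIntel") == 0;
          }

          int main() {
              // Keying the dispatch on the vendor string (rather than on the SIMD
              // feature flags reported in CPUID leaf 1) is exactly the behavior
              // complained about above: non-Intel CPUs get the slow path even
              // when they advertise full SIMD support.
              std::printf(is_genuine_intel() ? "fast SIMD path\n" : "generic fallback path\n");
              return 0;
          }
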
        • They did it with their C compiler in the past. When they detected anything but an Intel processor, they didn't use the SIMD instructions, even when the processor indicated full support for SIMD *IN THE INTEL DOCUMENTED WAY*. You can Google for the details.

          Or you can click here [swallowtail.org].

        • Well, time to burn off some karma: I actually don't have a problem with this. Why should Intel support AMD?

          Intel bore the costs of the x86 R&D, and the costs of marketing the platform, AND the costs of writing an extremely good C compiler. When AMD makes a copycat chip, it's no surprise that they can undercut Intel because they don't have any of those overhead costs. I don't have a problem with AMD legally reverse-engineering the x86, but they have no right to claim foul because they were too ch
      • Is that legal? I thought you could make *optimize* something for your own hardware instructions, but can you legally make it worse for other people?

        Hi there, you must be new to Reality(TM). We can see the problem here. You're confusing the word *legal* with *moral*. It's a common urban myth that legality and morality are two sides of the same coin. Thankfully, this is not the case, and this Re-Education Reminder is a friendly reminder of such facts. Welcome to the 21st century, enjoy your stay. This post sponsored by Halliburton, "Unleash the Energy."(TM)

    • by r606 ( 600880 )
      I keep seeing the word "gaming" applied to a lot more than Xbox-type activities. World stock markets are being looked at in terms of game theory. Might they be going in that direction?

        • I keep seeing the word "gaming" applied to a lot more than Xbox-type activities. World stock markets are being looked at in terms of game theory. Might they be going in that direction?

        Applying game theory to stock markets and other similar things isn't new. They were applying game theory to things like that before there were computer games of any kind.
  • Of course (Score:5, Insightful)

    by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Sunday September 16, 2007 @10:37PM (#20631831) Homepage
    Best way to get ahead in reviews? Optimize a common CPU-intensive component for your products. So long as they provide a generic implementation compatible with competitors' products, game developers will stay happy. But they'll still get that extra FPS lead that ensures higher benchmark scores than AMD, and a few FPS is all it takes.
    • Isn't it cheaper just to provide excellent tech support to Havok and a bunch of other companies though?

      Seems like Intel should just hire a guru per third party who can help the programmers there optimize for Intel CPUs. Since Intel has more cash they can just outspend AMD on this, and their CPUs will do better in benchmarks because all the inner-loop stuff is running as well as it possibly can.

      Buying the company seems like you want to fold some of its technology into some Intel product, or you want to em
    • Re: (Score:3, Informative)

      by TheRaven64 ( 641858 )
      It's all about power consumption. For the last 20 years (longer, actually), there's been a trend towards dedicated silicon for various tasks, then back to putting it on the CPU when the CPU was fast enough (because one chip is cheaper than two). Then something changed. People started caring that their CPU was using 100W. GPUs are a very different architecture to CPUs. Even when CPUs run fast enough that you can get away without a GPU, you will probably still want one because the GPU will use a whole lo
  • Physics acceleration on-die, here we come! Perhaps 'AI' acceleration as well?
    • It didn't work out and they had to destroy their secret lab [google.com] before it got further out of control.

    • Re:Awesome (Score:5, Insightful)

      by Anonymous Coward on Sunday September 16, 2007 @11:38PM (#20632233)
      There's no point in trying to accelerate game AI since even the most sophisticated game AIs are really very simple. Most game AI is just finite state machines. Each state typically corresponds to a simple behavior (e.g. patrol this area, pursue the player, etc.). More sophisticated game AI generally just means more states.

      Even so-called learning AIs typically consist of changing the frequency with which different preset behaviors are used.

      Only games like Civilization where there are a lot of choices to be made can really saturate a processor with AI tasks. And even those aren't that complicated; they just have a lot of stuff to do.
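
      As a rough sketch of how simple that kind of AI really is, here is a toy finite state machine in C++ (the states, thresholds, and names are all made up for illustration):

        #include <cstdio>

        // Toy guard AI: one enum, one update function. Each state is a simple
        // behavior; "smarter" AI usually just means more states and transitions.
        enum class State { Patrol, Pursue, Attack };

        struct Guard {
            State state = State::Patrol;
            float dist_to_player = 100.0f;

            void update() {
                switch (state) {
                case State::Patrol:
                    if (dist_to_player < 30.0f) state = State::Pursue;      // spotted the player
                    break;
                case State::Pursue:
                    if (dist_to_player < 5.0f)       state = State::Attack;
                    else if (dist_to_player > 40.0f) state = State::Patrol; // lost them
                    break;
                case State::Attack:
                    if (dist_to_player > 5.0f) state = State::Pursue;
                    break;
                }
            }
        };

        int main() {
            Guard g;
            for (float d : {50.0f, 25.0f, 4.0f, 60.0f}) {  // fake sensor readings
                g.dist_to_player = d;
                g.update();
                std::printf("dist=%.0f state=%d\n", d, static_cast<int>(g.state));
            }
            return 0;
        }
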
      • There's no point to accelerating the core logic, but try to think outside the box. There could be wins in accelerating other tasks which would allow the core logic to make better decisions. For example, you could accelerate computing visibility to find better hiding places, or accelerate pathfinding to get better paths for the core logic to choose from, or accelerate procedural animation to allow for more varied and realistic behaviors. A product which did these things could credibly be marketed as an "A
        • That kind of stuff would also save time on content creation (after the considerable initial time spent building the AI system), since you'd no longer have to build that behavior into the levels themselves (then playtest and revise it). You could just make a level, drop them in, and they'd go to work, assuming the AI was solid. Looking at products like euphoria [naturalmotion.com], there is clearly a market for some incredible middleware packages to be developed in this area, making the potential investment for game devs much
          • Absolutely. I'd go so far as to say that Euphoria, or something like it, will be the next leap in gaming immersion, now that we've long passed the point of diminishing returns in graphics. You can't have truly convincing human characters in games without something like Euphoria. But first they will have to get it to do more than just make guys fall over in more varied ways. The other day I found some really interesting research into generating dynamic walk animations [washington.edu]; something as simple as this (well i
            • I read somewhere about a middleware package that accomplished dynamic walking. IIRC it accepted some form of skeleton and managed to create a walk from that through some form of AI learning. I distinctly remember a quote from a spokesman for the company admitting they didn't really know how their software works! It just does. I don't think I've heard anything else about it since (this was a few years ago), so I assume the walk cycles it created were unacceptably clumsy or that game devs just don't know
      • This isn't true. Many games still use finite state machines (FSMs), but you also see things such as Bayesian networks, blackboard architectures, STRIPS, etc.

        Even if a game just used FSM, don't you think it would make sense to accelerate that? If I wanted to simulate New York City using only FSM, I would have to do thousands (millions?) of calculations per frame to get the behavior right.

      • I dunno. However many AI tasks it would take to keep my "enemies" from rubbing their faces into a wall, while I basically just put them down like a mad cow, would be fine by me.
      • Decision logic in games is generally simple.

        Gathering data for those decisions isn't. Pathfinding is intensive, and if you've got cycles to burn you can just up the resolution of your pathfinding space. Things like visibility checks are also an area where you can burn basically as many cycles as you want. You can make do with fewer raycasts, but more raycasts get you a better picture of the surrounding environment or enemies or what have you.

        That said, the main bound on AI is usually not processor tim
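
        To illustrate how pathfinding cost scales with resolution, here is a toy grid pathfinder in C++ (a plain breadth-first search; the grid and names are made up for illustration, and real games typically use A* over waypoints or navmeshes):

          #include <cstdio>
          #include <queue>
          #include <utility>
          #include <vector>

          // Toy breadth-first pathfinder on an n x n grid. Doubling the grid
          // resolution roughly quadruples the cells visited, which is the sense in
          // which you can burn as many cycles as you like by upping the resolution.
          int path_length(const std::vector<std::vector<int>>& blocked,
                          int sx, int sy, int tx, int ty) {
              int n = static_cast<int>(blocked.size());
              std::vector<std::vector<int>> dist(n, std::vector<int>(n, -1));
              std::queue<std::pair<int, int>> q;
              dist[sy][sx] = 0;
              q.push({sx, sy});
              const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
              while (!q.empty()) {
                  auto [x, y] = q.front();
                  q.pop();
                  if (x == tx && y == ty) return dist[y][x];
                  for (int i = 0; i < 4; ++i) {
                      int nx = x + dx[i], ny = y + dy[i];
                      if (nx < 0 || ny < 0 || nx >= n || ny >= n) continue;
                      if (blocked[ny][nx] || dist[ny][nx] != -1) continue;
                      dist[ny][nx] = dist[y][x] + 1;
                      q.push({nx, ny});
                  }
              }
              return -1;  // unreachable
          }

          int main() {
              std::vector<std::vector<int>> grid(8, std::vector<int>(8, 0));
              grid[3][2] = grid[3][3] = grid[3][4] = 1;  // a small wall segment
              std::printf("steps: %d\n", path_length(grid, 0, 0, 7, 7));
              return 0;
          }
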
    • Re:Awesome (Score:5, Insightful)

      by TheLink ( 130905 ) on Monday September 17, 2007 @01:36AM (#20632891) Journal
      The only people who really would enjoy games with smart AI are a subset of people who play chess and similar games.

      The rest would grumble that the game is too hard. Most humans can't beat a single really, really good AI, or thousands of weak AIs. So why bother accelerating AI?

      Most people want games that are fun. Just some clever heuristics will be good enough.

      I play Guild Wars, and the "heroes" (computer-controlled teammates) are better than most random humans (in fact they do a lot of things better than I do - I can't multitask well, have slower reflexes, etc.). They could be made much smarter (they tend to cluster together and get nuked), but that would take the challenge out of the game, unless the opponents were made equally intelligent, in which case it would be a battle of the AIs with the humans being insignificant, and thus not much fun for the humans.
      • Accelerating AI wouldn't necessarily make it more difficult to beat in most games, just more random and lifelike, and it would lower the massive skills chasm between playing against a computer and playing against a human. Wouldn't it increase the fun and replayability of games to have more intelligent AI?
      • Re: (Score:3, Insightful)

        by PhoenixOne ( 674466 )

        Yes, because the only people who could enjoy AI that did more than basic pathfinding and state-tree searches are chess players(???)

        Good AI != Tough AI. With today's technology, you can easily make an AI that always knows where the player is, always selects the best weapon, and always hits the target for "massive damage"(tm). This is trivial. The only reason you don't see this is because the game wouldn't be any fun.

        The trick is making AI that is interesting, fair, and fun. In an FPS I don't want to have

        • by TheLink ( 130905 )
          Sure an AI that always knows where the player is would be cheating.

          But if you're not looking for "hard", or "pass the turing test" and just looking for "fun", then you don't have to do much, just need a few clever tricks and that's it. Not much CPU needed.

          In Guild Wars, there are already enemy AIs in certain PvP arenas which are not that easy to beat, and I'm sure they could make them harder - but what's the benefit to the game maker? Those AI opponents even say GG when they win, I'm sure they could code th
          • I'm not sure where to start with this. I'll just say that you have an "interesting view" on what it takes to make game AI fun and leave it at that. :)

            • by TheLink ( 130905 )
              Well, it's too hard to make a turing complete AI :). I suggest that most people would be happy with clever tricks which game makers can think of and code in.

              It's probably too little gain to do it in specialized hardware, unless people can think of a game AI approach that works much better on specialized hardware than on general CPUs.

              If people want intelligent nonhuman entities, I suggest they get one from their local pet store ;).

              Maybe game makers could m
              • A 'Turing complete' AI isn't an issue, but I think you mean one that passes the 'Turing Test'. This has been done already, and I think the Lovelace Test has been passed as well.

                But this is beside the point. Making an AI more human (or more dog-like) isn't always the best goal. For games, it's all about more fun.

                (And I would suggest that you play catch with your dog, not Halo3. ;) Even if you could train a dog to play deathmatch, an AI opponent would be much more fun.)

      • You're assuming that computers can't already be better than humans? That's easy. If every mob talked to each other (at computer speeds) and swarmed you, you'd be dead.

        Better A.I. doesn't mean harder A.I.; it means more depth of A.I.: mobs not just running in circles, bots that act more human, friendships between monsters (or bots in an FPS, etc.).

        Think of what better A.I. could do for the most popular series of all time, the Sims.

        Decision trees are hard, hard to code, tough on processors... anything that c
  • Now, when I shoot my Intel chip with a rocket launcher or maybe ride it off a cliff and fall off, it'll look much more realistic. Let's see AMD do that.
  • This will no doubt be used by Intel to tout future multi-core CPU releases, as it now has the ability (by force if necessary) to make the Havok engine run even better on their chips. One could dream up many long-term ramifications of this.

    - Partner with MS to integrate the Havok engine into future DirectX releases.
    - Realistic chance of Physics on GPU standard (AMD/nVIDIA purchase licensing)
    - Potentially hurt 3rd parties that use the engine on other chips. (Cell/PhysX)
    - Spur next generation of physics engine that
  • by Samir Gupta ( 623651 ) on Sunday September 16, 2007 @11:26PM (#20632173) Homepage
    All game consoles of the current generation use non-Intel chips. Amongst game devs, Havok are renowned for their quality technical support and the work they put into tweaking their physics engine for all the platforms: Intel PCs, AMD, and the PPC consoles.

    What's to say Havok won't "focus" their optimization efforts in the future on Intel exclusively?

    This is sort of like what Sony did with SN Systems (a very good maker of third-party dev tools for consoles), which then dropped all support for non-Sony platforms.
    • I hope they won't, but I share the skepticism. It would be bad for Havok's business to drop 3rd-party support, but it would not be unlike any large corporation to take an asset that was previously neutral and tilt it in their own favor (and probably abuse such a system for 5-10 years till they slip and do something illegal). Curious, what did Havok do for PPC?
    • Re: (Score:3, Funny)

      by Goalie_Ca ( 584234 )
      Interesting words coming from the "Head, New Technology Research Group, Nintendo Co. Ltd." I suppose you keep your eye on this sort of thing.
    • by jeffbax ( 905041 )
      I don't think that'll really be an issue. Seriously, all three consoles are using PowerPC, and chances are their successors will too because it makes backward compatibility easier. You think Sony's ever going with an Intel chip? The console games market is way bigger than the PC market, and I'm sure Havok makes more money from their console business than the PC and it would be foolish for them to ruin that and give the market to competitors like Ageia who would gladly steal it.
    • by LetterRip ( 30937 ) on Monday September 17, 2007 @01:26AM (#20632829)
      Perhaps they can look at Bullet - http://www.continuousphysics.com/Bullet/ [continuousphysics.com] - a high-performance cross-platform physics library that is open source. I know it is optimized for Xbox 360 and PS3, and I'm pretty sure it has been used for first-tier games on both. Not sure if it has been optimized for the Wii, though.

      LetterRip
  • And look forward to Intel reaffirming their monopolistic status.
  • by MSRedfox ( 1043112 ) on Monday September 17, 2007 @12:13AM (#20632431)
    Both ATI and Nvidia's GPU based physics acceleration were being made to work with Havok. ATI was working on a 3 card Crossfire rig, 2 for graphics, 1 for physics. I wonder what this will mean for future developments. http://ati.amd.com/technology/crossfire/physics/index.html [amd.com]
    • They will probably continue to work on that, as they still want to sell Havok so they have to make a product that's attractive to their customers. However, when Intel's new graphics card [wikipedia.org] comes out, expect Havok's support for it to be especially good.
    • It will do very little to nothing. Ageia and CPU physics are performed in main system memory, so the game can interact with them and they can interact back with the game. The physics on the GPUs exist only superficially in GPU memory, so only the GPU, not the CPU, can interact with the objects individually or tell them what to do. All the CPU and game can do is tell the GPU which objects to push, pull, modify, spawn or kill and how they behave, but the CPU can't look up individual objects and their positions.
    • Both ATI and Nvidia's GPU based physics acceleration were being made to work with Havok.

      I think you've hit on the primary motivation here. This may have been done as much to hobble Fusion [wikipedia.org], as to support Nehalem [wikipedia.org].

  • ODE (Score:4, Interesting)

    by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Monday September 17, 2007 @03:24AM (#20633449) Journal
    I'm amazed how rarely it gets mentioned, but you know, there is an open competitor [ode.org], sort of. I say "sort of" because I've never actually written a game that needed physics, so I don't know whether ODE is to Havok as OpenGL/SDL is to DirectX/D3D.

    Also raises the question: Will Intel force everyone to use Havok to take advantage of any physics-related silicon they develop? Or will they be friendlier to ODE? Or will they not create any physics-specific silicon, and make this whole discussion moot?
    • ODE is good but, like most open source projects, I think it tries to be too much. It also has some fundamental issues that haven't been worked out yet.

      Havok has great game support (for people who can afford it) including support for PS3 and XBox360. It does one thing, game physics, and it does it well.

      IMHO

    • Well, S.T.A.L.K.E.R. uses ODE for its physics, and I thought it did a pretty good job in that dept.
    • Re: (Score:2, Informative)

      by Zeussy ( 868062 )
      I have used ODE and I was actually quite disappointed with its performance compared to that of Bullet. Its feature set isn't as rich either. Convex hulls are not fully supported yet, and they are not very optimized. So far the best physics engine I have used is Bullet.

      Bullet is open source, fast, and feature-rich. It supports stable stacking and even moving concave hulls.
      ODE is open source, but I found it slow and a little feature-poor.
      Newton is closed source but free. I found you could easily bog it down and r
      • Now moving into the realm of I-have-no-idea-what-I'm-talking-about. That said, here's something from the ODE mailing list:

        Bullet Collision Detection can work with ODE, or it can use Bullet's native
        Dynamics. The Dynamics part of Bullet is very limited. No limits, no motors,
        only point to point and contact constraint. It uses a sequential impulse
        based method which is very similar to PGS in the end.
        • by Zeussy ( 868062 )

          Now moving into the realm of I-have-no-idea-what-I'm-talking-about.

          I hope you are talking about yourself there. If you check the date, that mail is from Feb 2006, and a lot has happened in the 18 months since. If you care to take a look at Bullet's feature list [bulletphysics.com], you will see that it now supports:

          Projected Gauss-Seidel (quickstep)

          and

          Generic 6 Degree of Freedom Constraint , Motors, Limits

          In a recent project of mine I created an ODE implementation, but it was painfully slow. I changed my implementation to use Bullet and got about a 5x improvement in performance. For me Bullet was superior, especially in the realm of convex collisions & dynamics.
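
          For reference, here is a minimal sketch of the usual Bullet 2.x setup (a static ground plane plus one falling sphere, stepped for a couple of seconds); this is generic example code under those assumptions, not anything taken from the project described above:

            #include <btBulletDynamicsCommon.h>
            #include <cstdio>

            // Minimal rigid-body world: broadphase + dispatcher + solver + world,
            // then add bodies and step. Cleanup/error handling omitted for brevity.
            int main() {
                btDefaultCollisionConfiguration config;
                btCollisionDispatcher dispatcher(&config);
                btDbvtBroadphase broadphase;
                btSequentialImpulseConstraintSolver solver;  // the PGS-style solver mentioned above
                btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
                world.setGravity(btVector3(0, -9.8f, 0));

                // Static ground plane at y = 0 (mass 0 makes the body static).
                btStaticPlaneShape groundShape(btVector3(0, 1, 0), 0);
                btDefaultMotionState groundMotion;
                btRigidBody ground(btRigidBody::btRigidBodyConstructionInfo(0, &groundMotion, &groundShape));
                world.addRigidBody(&ground);

                // One dynamic sphere dropped from y = 10.
                btSphereShape sphereShape(0.5f);
                btScalar mass = 1.0f;
                btVector3 inertia(0, 0, 0);
                sphereShape.calculateLocalInertia(mass, inertia);
                btDefaultMotionState sphereMotion(
                    btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
                btRigidBody sphere(btRigidBody::btRigidBodyConstructionInfo(mass, &sphereMotion, &sphereShape, inertia));
                world.addRigidBody(&sphere);

                for (int i = 0; i < 120; ++i) {  // ~2 seconds at 60 Hz
                    world.stepSimulation(1.0f / 60.0f);
                    if (i % 30 == 0) {
                        btTransform t;
                        sphere.getMotionState()->getWorldTransform(t);
                        std::printf("t=%.2fs  y=%.2f\n", i / 60.0f, t.getOrigin().getY());
                    }
                }
                world.removeRigidBody(&sphere);
                world.removeRigidBody(&ground);
                return 0;
            }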

          • I hope you are talking about yourself there.

            Yes, I was. I wanted to acknowledge, up front, that I was quoting / linking to something I don't really understand.

            I guess that answers the question of an open API -- I'll bet ODE and Bullet are not drop-in replacements for each other, meaning we have a ways to go before we can do this as generically as we do graphics (OpenGL).

            Sad, though.

            • by Zeussy ( 868062 )
              Actually, their APIs are very similar to each other. Bullet was very close to a drop-in replacement for ODE; since all physics engines simulate physics in a similar fashion, their APIs end up reasonably similar.
              I'm sorry I misunderstood your statement at the beginning.
              I would say that ODE and Bullet are much more similar to each other in their APIs than OpenGL and Direct3D are.
              • Fair enough, but what I want is to be able to say with confidence that we've either decided to standardize on one physics engine, or we've got several which are as similar as nVidia's OpenGL implementation is to ATI's OpenGL implementation.

                It's not so much a question of difficulty of porting; I'm thinking back to this being an alternative to proprietary lock-in to physics hardware, should it ever be useful.
                • by Zeussy ( 868062 )
                  Yeah, I know what you are saying; it would be nice to have an OpenPhysics where we'd have vendor-specific implementations.
  • It would be nice if Intel nudged them into supporting other OS's than just MS-Windows. Intel does seem to be multiplatform friendly in other realms. With AMD responding to Intel by opening up ATI, it is a good trend.
  • According to reports Intel will buy 100 per cent of Havok, a.k.a. Telekinesys, for about €79.2 million in cash in a deal expected to close within five days. That's a good price for a company that produces the best physics and animation simulation technology on the market. The big win for Intel is the staff at Havok; they are some of the best engineers in the business, a great addition to Intel, and I'm sure it makes for exciting times ahead for Intel's 'gaming' plans. This is a great purchase for Intel a
  • by mattr ( 78516 )
    Easy: they will make more software that uses those engines, and will boost those engines so they continually require the most cutting-edge CPUs.

    Intel invests in companies that develop products which make people want to buy higher end chips, for example physics-based acoustic instrument simulation like one company I know.
  • Isn't there a large market for this in the supercomputer/rendering arena? Stuff like realistic-behaving orc armies and that type of thing. Maybe they're planning to integrate this into their high end offerings (Havoc aboard the Itanic?).

    Just an idea.

    • by Kelbear ( 870538 )
      I would surmise that there are probably better solutions for those special effects houses. Havok is nice for games which need to be handled in real-time, but rendered movies have more time to work their calculations and would probably want accuracy and realism over efficiency. So I'm going to guess that Havok isn't aiming for their needs.

      For example, running is complicated physics but it's theoretically doable. But in games you want to save overhead so instead of a long formula drawing in all the factors, g
  • by bogie ( 31020 )
    This was used to make some of the very best games ever made on multiple platforms. I'm sorry to see it get snapped up by the Borg of silicon. Although I'm probably not nearly as sorry as the companies who currently have games in production using Havok. Games that used Havok http://en.wikipedia.org/wiki/List_of_games_using_physics_engines#Games_using_Havok [wikipedia.org]
  • This is about Havok's investors finding an exit strategy, I expect. Havok isn't very profitable, and they had to shrink the company considerably a few years back. Game middleware just isn't that profitable a business. Havok found new investors and hung on, replacing their top management, but the new investors need to cash out at some point. This is it.

    The other major player in this space was Mathengine, which was a dot-com of sorts - too much initial investment and too little revenue. EA acquired them
