RAYA: Real-time Audio Engine Simulation In Quake 89

New submitter bziolko writes: RAYA is a real-time game audio engine that uses beam tracing to provide the user with realistic auralization. All audio effects are computed from the actual geometry of a given game level (video) as well as its acoustic properties (acoustic materials, air attenuation). The sound changes dynamically as the game character and sound sources move, so the listener can feel as if they were right there — in the game.
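As a rough illustration of the summary above, a geometric audio engine traces sound paths through the level and turns each path's length and surface hits into a delay and a gain. The sketch below is a hypothetical simplification (the function, coefficients, and air-attenuation figure are assumptions for illustration, not RAYA's actual model):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def ray_contribution(path_length_m, absorptions, air_atten_db_per_m=0.02):
    """Delay and linear gain for one traced sound path (simplified sketch).

    path_length_m -- total length of the path from source to listener
    absorptions   -- absorption coefficient (0..1) of each surface hit
    """
    delay_s = path_length_m / SPEED_OF_SOUND
    # Each reflection keeps (1 - absorption) of the amplitude...
    gain = 1.0
    for a in absorptions:
        gain *= 1.0 - a
    # ...plus frequency-independent air attenuation, given here in dB/m
    gain *= 10 ** (-air_atten_db_per_m * path_length_m / 20)
    # ...plus spherical spreading loss (1/r), clamped near the source
    gain /= max(path_length_m, 1.0)
    return delay_s, gain

# Direct path to a source 34.3 m away arrives ~100 ms late
delay, gain = ray_contribution(34.3, absorptions=[])
```

A real beam tracer sums many such contributions per frame and updates the set of valid paths as the listener and sources move, which is where the dynamic effect described in the summary comes from.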

This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward

    This is what happens when you DON'T open source your games. Your game doesn't make the news when researchers DON'T use your games for research.

    • Id open sources its older code to act as an incentive to developers to license their latest engine. It's the same as MS giving away VS Express or a free hit from a drug dealer. Since most developers are already licensing an engine from someone else there is little incentive to open up their code.

  • by Nyder ( 754090 ) on Sunday August 31, 2014 @02:48AM (#47794229) Journal

    I liked what I heard, but I'd really like to have a demo of it to check out.

  • by Animats ( 122034 ) on Sunday August 31, 2014 @03:05AM (#47794255) Homepage

    Quake audio consists mostly of footsteps and bangs. This might be fun for, say, GTA IV/V, where the NPCs have conversations to which you can listen if you're close enough.

    • Guild Wars 2 implemented a system like this to dynamically calculate both occlusion settings and reverberation and echo in real-time.

  • Somehow (Score:5, Funny)

    by bragr ( 1612015 ) * on Sunday August 31, 2014 @03:09AM (#47794265)

    Somehow this will cause someone to puke.

    • *BLORPH*

      Oh man! Right in your lap! Sorry about that dude!

      I'll try to aim someplace else next time...


      Well... at least it wasn't your lap this time...

    • Or someone will figure out how to hit the brown note, which will make multiplayer griefing much more interesting.

    • Re:Somehow (Score:5, Interesting)

      by DoofusOfDeath ( 636671 ) on Sunday August 31, 2014 @08:52AM (#47794917)

      Somehow this will cause someone to puke.

      As someone who's worked on 30-year-old acoustic ray-tracing software models, the fact that they're attempting to get a patent makes me want to puke.

      Fortunately, we can count on the vigilant patriots at the USPTO to view the patent with skepticism, and bring a combination of deep domain knowledge and Rottweiler-like tenacity to look for prior art.

  • by Anaerin ( 905998 ) on Sunday August 31, 2014 @03:20AM (#47794283)
    There was a company back in 1997 that had a fantastic (series of) cards that did all this 3D transformation, reflection, deflection and occlusion of audio in hardware. The company was Aureal, and their A3D system was fantastic, doing everything that this demo showed. The competitor, Creative's EAX, instead used the entirely dumb method of "turn on reverb in a room". Creative sued Aureal, thinking they had a leg up on 3D audio. Aureal countersued, and won, but the legal costs drove them into bankruptcy. Creative then bought Aureal's assets and buried the company, and all its technology, never to be seen again. In fact, EAX is still the stupid-simple (and very broken) "turn on reverb" (though now it also has "adjust reverb"). And, as Creative have shown before (with the whole "Carmack's Reverse" fiasco), they're more than willing to use legal means to muscle their way in.
  • by plibnik ( 636383 ) on Sunday August 31, 2014 @03:58AM (#47794357)
    For those who play with headphones, not with 5.1 or 7.1 surround audio, a system that tracks head rotation and tilting is needed (similar to what they have for airplane sims, where you wear a hat with markers and a webcam tracks your head position, and the view on the displays changes accordingly). I haven't seen any of those on the market yet. Maybe you've heard of such things?
    • by Anonymous Coward

      That might be a built-in feature. The sort of thing that you get "for free" when calculating the sound in this way.
      But unless you are also applying the rotation to the character as it is rendered on-screen, it doesn't make sense to apply it. If the sound you hear is distorted without a corresponding graphical equivalent, it will just sound wrong.

      • by plibnik ( 636383 )
        Well, I do turn my head a bit - maybe 10-20 degrees to the sides - when I play some FPS. Just imagine yourself, e.g., strafing out of a corner. Supposedly your EYES are still locked on the display before you, yet you can rotate your head a bit. And this should mean a lot in such a detailed environment: make all these efforts to produce a precise acoustic environment, and then fail by ignoring head direction - so that the sound would be inconsistent with the picture on a fixed monitor before you? BTW, when I was younger
    • by Anonymous Coward

      I never understood those head trackers. I mean, if I have a screen in front of me and turn my head to the right, then the display may very well change but I'm now looking to the right so won't see it (or need to look out of the corner of my eye). If instead I have a bank of monitors, so that I could see any adapted view - it wouldn't need to change the display!
      Whereas, I can see benefit in your suggestion - we tilt our heads to identify sound sources so that would work quite well even without the visual


      • by Rich0 ( 548339 )

        The other replies cover the window-like way of doing things. The other way to do it is to make it so that if you look right, the screen rotates to the right. Usually you have a multiplier, so that a small head rotation translates into a much larger rotation on-screen. Looking backwards might only require turning your head 45 degrees, which allows you to still look to the side and see the screen.

        This might sound awkward, but your brain adjusts to it with almost no effort. The main problem I've seen with

      • If, as the post you're replying to implied, you were wearing headphones, it *would* need to change the audio, though.
      • by Anonymous Coward

        Posting as Anonymous simply because I'm too lazy to create an account.

        You should look into TrackIR. It's one of the head trackers we use in the sim community (largely flight sims, but it's used elsewhere as well). Most of these head trackers use accelerated or exaggerated movement for looking around. This allows you to turn your head just a couple of degrees and you may be looking 90 degrees to the side, in game. It's also fully configurable along with a deadzone so the camera isn't constantly twitching

    • by Luckyo ( 1726890 )

      That's not going to happen fast enough. Our hearing is exceptionally sensitive to timing, far more so than sight. As a result, the only way you get this is with speakers.

      I strongly recommend getting a good 5.1 speaker setup if you're into gaming and enjoy positional audio. The vast majority of games nowadays have a proper directional sound implementation in software, so you'll get what you pay for.

    • For whatever reason, it isn't something there's much interest in, but it does exist. I am aware of three options:

      1) The HeaDSPeaker. The cheapest option. A little device from a not very well known company called VLSI Solutions. It handles the head tracking and HRTF, you provide the headphones. Runs about 340 Euro ($450). It can take input either as a Dolby Digital stream, or directly as USB from the computer.

      2) The Beyerdynamic Headzone. This is an all-in-one solution from Beyerdynamic. Has a decoder, HRTF

  • by marcello_dl ( 667940 ) on Sunday August 31, 2014 @04:19AM (#47794383) Homepage Journal

    It is good to give devs the option of realistic audio, but for games in medium-to-big settings, the relative slowness of sound propagation is a problem. Getting a headshot and only later hearing the sound is counterintuitive, at least for the Hollywood generations. I guess that realistic effects with no delay in sound propagation are the way to go.

    • by Mal-2 ( 675116 )

      A pretty good rule about hearing gunshots is:

      If you hear the gunshot, they missed you. Even if the bullet is subsonic, you'd feel it before you heard it, if it hit you.
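The propagation delay the parent comments are arguing about is simple arithmetic: sound covers roughly 343 m/s in air, so a shot from a few hundred metres away arrives noticeably late. A back-of-the-envelope sketch:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def gunshot_delay(distance_m):
    """Seconds between seeing a muzzle flash and hearing the bang."""
    return distance_m / SPEED_OF_SOUND

# At 343 m the bang arrives a full second after the flash,
# which is exactly the "headshot first, sound later" effect above.
delay = gunshot_delay(343.0)
```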

  • How is this at all different from GSound?

    I even have a basic working implementation of it modded into Arma 3...
  • Aureal Technologies (Score:5, Interesting)

    by Anonymous Coward on Sunday August 31, 2014 @05:37AM (#47794537)

    This (audio raytracing) was done in the late 90s by a company called Aureal.

    Their 3D audio cards were UNBELIEVABLE. I played the original HL using one - and played CS using them - and they were a game-changer. If you had one, you were 10x better off than someone who didn't. You could tell how the battle outside was going by hearing how the people firing were changing position - if your team (you knew which direction they were entering combat from) were firing and moving forward, then they were winning.

    One of the demos was a helicopter circling the player's head. You tracked it with your eyes and mind as it went round - it actually R E A L L Y sounded like a true, physical helicopter circling your head.

    Creative sued them into failure.

    I've never forgiven Creative for this. I've never and will never buy any of their products.

    • by Luckyo ( 1726890 )

      To be fair, EAX, while not actually calculating sound based on geometry, approximated it closely enough. I tested both Aureal's and Creative's cards back in early 2000, and the difference was only noticeable if you really, REALLY focused on it in games that supported both A3D and EAX.

      And most modern games use full 3D positional audio with echo, reverb and other functions in software that sounds almost as good as real ray-traced sound on almost any decent sound codec. Though I still run an Audigy 2 in all my gaming rigs

  • by KozmoStevnNaut ( 630146 ) on Sunday August 31, 2014 @07:03AM (#47794701)

    Similar to what Aureal was doing with A3D back in the 90s, but obviously not tied to a specific piece of hardware like back then.

    I enjoyed the Quake 3 demo, but while it works decently well with just the player in the level, it sort of falls apart during the deathmatch. I think that's probably because the stock Q3 sounds have a bit of reverb baked in. I would love to hear what it would sound like with a complete set of reverb- and echo-less sound effects, so RAYA can handle everything by itself instead of working on top of the baked-in reverb.

  • Games (Score:5, Insightful)

    by ledow ( 319597 ) on Sunday August 31, 2014 @08:44AM (#47794899) Homepage

    Realistic sound has been around, as people point out, since the Aureal days. Now, to be honest, it should be baked into every engine and tied to your textures (soft textures absorb sound, shiny textures reflect sound, etc.).
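Tying acoustics to textures, as suggested above, could be as simple as a per-material absorption table. The coefficients below are illustrative placeholders, not measured values, and the material names are assumptions:

```python
# Hypothetical per-material absorption coefficients (fraction of
# amplitude lost per reflection); a real engine would use measured data.
ABSORPTION = {
    "carpet":   0.60,  # soft textures absorb most of the energy
    "curtain":  0.50,
    "wood":     0.10,
    "concrete": 0.02,  # hard/shiny textures reflect almost everything
    "metal":    0.01,
}

def reflected_gain(materials):
    """Amplitude fraction surviving a sequence of surface reflections."""
    gain = 1.0
    for m in materials:
        gain *= 1.0 - ABSORPTION[m]
    return gain
```

With a lookup like this keyed off the same material IDs the renderer already uses for textures, the audio engine gets its acoustic properties essentially for free.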

    The fact that it isn't means one of a few things: it's too expensive (which I can't believe nowadays), it adds too much cost to development time (but surely modifying those sounds for echo etc. is more costly than just putting in a pure sound and letting the engine modify it as necessary), people just don't notice that much, or the patent field is too heavy.

    Take things like TF2, HL, CS, etc. They are all same-engine. They are all 3D open environments. It is vital to know where shots etc. are coming from in order to play properly. But we don't see such audio tricks. That, to me, suggests they aren't necessary or certainly not the right value to waste time on.

    And, to be honest, I watched "ray-traced quake" over, what? Ten years ago? That tech still isn't used in modern games because of the above reasons. It's do-able but expensive, the development time is costly, the effect isn't that much different from pure cheating on the 3D drawing, and it's not in any of the major game engines. This is suggestive of the value of such things being minimal.

    And, to be honest, the realistic-"ness" of a game lasts the first few minutes of unboxing, and then that's it. What destroys your immersion from then on is a crappy plot, unrealistic capabilities, and AI that still - to this day - sucks. Fire gun, run around corner, wait for the idiots to pile round. The "better" ones might well throw a grenade, but once you know that, you take account of that, and that's the AI beaten. To "win", the AI has to have reactions infinitely better than yours and outnumber/outgun you. Think about the average FPS game - there are several THOUSAND bad guys. And you. And though you might get stuck occasionally, you will win. You can use first-aid kits, they can't. You can lure them into traps, they can't (unless scripted). You can sit and wait them out. You can guess where they will walk next; they forget about you one second after they stop seeing you. It's ludicrous.

    Please stop wasting our game industry by reinventing tech we've had for decades and could put in any game, given time. Let's try and make a game with one, single, scary opponent (and maybe some NPCs to fill in the gaps). A Matrix-like game, for example. Agents are few and far between, maybe one per real player. There is only one that's a real threat. And there's you. And a world that you can both use to your advantage.

    When humans play humans you HAVE to have the same numbers on both sides. When humans play AI, you HAVE to be vastly outnumbered.

    I'd much rather Half-Life 3 had intelligent enemies who will choose to camp the chokepoints and not be lured out, than some fancy water effect or proper audio reflections or whatever.

    You're not telling me that, with the CPU/GPU available nowadays, we couldn't make a Quake 1 opponent that - with the same programmed reaction times, capabilities, and facilities available to it as to a human player - would be a serious threat. I'd rather play that than yet another "look how shiny" kind of game.

    • I'd rather have both. This tech has been out since the original Half-Life days. It is not complex, either programmatically or computationally. The problem is that the people driving the design of this system were sued into oblivion by a technically inferior Creative Labs. Realism and immersion are two different things. The ability to be situationally aware with sound is a massive advantage for immersion in a game.

      In summary this tech has nothing to do with studios crapping out poor plots or crap AI. The engines s

    • That is why I say: "The best AI ever designed was a live human opponent"

      Lazy developers & designers would rather jack the hit points of a boss up to 10x your life than spend time making it behave in an interesting fashion.

      If you haven't played Dark Souls 1 & 2 with its PvP --- check it out.

  • Never mind making Quake/QuakeII/Quakex give audio cues that match the environment more precisely. When do I get a holosuite? I'd very much like the sound to match the image there, especially for some of the more, er, interesting holosuite programs.
  • The problem has always been that, in games, audio (soundtracks notwithstanding) is seen basically as a gimmick. A few games do it well, but for most, it's an afterthought.

    The selling point has been, and always will be, graphics. Some reasons: humans are predominantly visual; magazine-based reviews can't demonstrate audio (this is changing due to YouTube and other video reviews); lack of audio hardware that WORKS properly (i.e., a sound card that processes EAX/positional audio, speakers to take advantage of it, a

  • I was never fortunate enough to actually own my own Aureal card.

    But I really, really can't understand why, having bankrupted them and taken all of their technology, Creative didn't do the sensible thing and USE IT.

    Even now A3D is still vastly superior to the latest EAX FIFTEEN YEARS LATER.

  • To me, this demo is serious uncanny valley territory.

    When I was composing MOD music on my Amiga back in the late '80s, I was very much aware of the problem of playing the same instrument on the left and right channels at the same time, especially when doing pitch slides. You got all kinds of weird interference problems, or the audio version of moiré effects, if you will. If you were a good composer, it could be used to good effect in music in a lot of cases, but most of the time it was a real pain, especial
