
Cloud Gaming With Ray Tracing (83 comments)

An anonymous reader writes "Since real-time ray tracing on a single desktop machine is still too slow for gaming, Intel has worked on a research project that puts the heavy workload up into the cloud, relying on multiple 32-core chips working together. That enables new special effects for games, like realistic reflections (e.g. on cars or the scope of a sniper rifle), glass simulations and multi-screen surveillance stations. The research paper also takes a closer look at various sources of latencies that become relevant in a cloud-based gaming approach."
  • Bad Tech Journalism (Score:5, Informative)

    by Sonny Yatsen ( 603655 ) * on Wednesday March 09, 2011 @11:21AM (#35431144) Journal

    First Line of the Article:
    "A new technology from Intel called ray tracing could bring lifelike images and improved 3D effects to games on tablets and other mobile devices."

    GAH!

    • by devxo ( 1963088 )
      Yeah no shit, I programmed 2.5D ray tracing when I was like eight years old. I could model our home and other houses with it and have a few sprites in the "game", and it had a certain 3D feeling even though I knew nothing about 3D graphics. I think Doom was done similarly.
      • by sockman ( 133264 )

        No. Doom was ray casting, a much simpler technique.

        • That's why he said '2.5D ray tracing' (another way to say 'ray casting').

          • by amn108 ( 1231606 )

            Eh, I don't know about your (if any) experience with computer graphics, but as far as I know, and I did my fair share of CG programming, raycasting has nothing to do with the mentioned '2.5D'. DOOM was '2.5D' because of the peculiarities and limitations of its supposed 3-D engine, where level maps were stored in memory in such a way (BSP) as to make it impossible to put two floors on top of one another (if I am not mistaken), and it also couldn't handle sloped surfaces. That's why it is called 2.5D. Basically

        • Doom was absolutely not done with ray casting. The scene is composed, piece by piece, by rendering on-the-fly approximations of the walls and ceilings based on a 2D level map. Nothing was truly 3D (so I guess that's why you're calling it 2.5D), but nothing used ray tracing or ray casting either. Maybe they threw a few rays around to find which walls they hit, but they didn't do ray tracing or ray casting per pixel to calculate color values. As soon as it knows which wall is where, it just does a fill on that w
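To make the distinction the posters above are drawing concrete, here is a minimal sketch of Wolfenstein-style grid ray casting: one ray per screen column marched through a 2D tile map, with wall height derived from hit distance. It is not Doom's BSP renderer and not per-pixel ray tracing. The toy map, the names, and the fixed-step march (real engines use a DDA over the grid) are all illustrative assumptions, not code from any engine.

```python
import math

# Wolfenstein-style "2.5D" ray casting sketch: one ray per screen column,
# marched through a 2D tile map until it hits a wall ('#'). Nearer walls
# project taller columns -- that is all the "3D" there is.

MAP = [
    "#########",
    "#.......#",
    "#..##...#",
    "#.......#",
    "#########",
]

def cast_column(px, py, angle, max_dist=20.0, step=0.01):
    """March a single ray from (px, py) until it reaches a wall tile."""
    dx, dy = math.cos(angle), math.sin(angle)
    dist = 0.0
    while dist < max_dist:
        x, y = px + dx * dist, py + dy * dist
        if MAP[int(y)][int(x)] == "#":
            return dist
        dist += step
    return max_dist

def render(px, py, facing, fov=math.pi / 3, width=40):
    """One ray per column; returns a wall slice height for each column."""
    columns = []
    for col in range(width):
        angle = facing - fov / 2 + fov * col / width
        dist = cast_column(px, py, angle)
        dist *= math.cos(angle - facing)          # remove fisheye distortion
        columns.append(int(10 / max(dist, 0.1)))  # taller slice = closer wall
    return columns

print(render(px=2.5, py=2.5, facing=0.0))
```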
    • Even better:

      The research paper also takes a closer look at various sources of latencies that become relevant in a cloud-based gaming approach.

      How about those fucking usage caps???

      • by slim ( 1652 )

        How about those fucking usage caps???

        They don't seem to have put Netflix out of business.

        However, it's true that demand for higher bandwidth applications will drive a market for higher caps, uncapped contracts, and faster pipes.

      • How about satellite users WITH usage caps? Not only do they have to deal with the cap, but also with 500 ms of latency. I know we're getting into appealing to the rural demographic, but farmers have money, too.
    • Didn't you see the word bullshit, spelled "Cloud," in the topic? Unless you see "Sephiroth" with it, you should immediately recognize the fact that nobody knows what they're talking about and they want to sell you garbage.
    • Intel has been trotting this story out every three months or so for as long as I can remember.

      As memes go, "Intel shows fully raytraced game" is right up there with "Duke Nukem is nearly finished!" and "This year will be the year of the Linux desktop".

    • Comment removed based on user account deletion
  • I was unaware Whitted worked for Intel. </sarcasm>

    • I was unaware Whitted worked for Intel. </sarcasm>

      Actually, someone else may have beaten him to it .. but now I can't find the paper that cites the earlier reference.. so this post is a bit pointless :-(

      • I was unaware Whitted worked for Intel. </sarcasm>

        Actually, someone else may have beaten him to it .. but now I can't find the paper that cites the earlier reference.. so this post is a bit pointless :-(

        Oh that's typical... I just found the paper. It's "Interactive Rendering with Coherent Ray Tracing" by Ingo Wald, Philipp Slusallek, Carsten Benthin, and Markus Wagner, published at Eurographics 2001.

  • Can't say I've ever heard of him though. I used to play against someone called Polly, but she's gone now.

  • "A new technology from Intel called ray tracing could bring lifelike images and improved 3D effects to games on tablets and other mobile devices." Ray tracing has been around a long time. Even ray tracing in the cloud isn't that new. NVidia has the RealityServer.
  • It's not that impressive, either.

    On the topic of raytracing - one thing that still stands out to me from the images in the paper is the lack of proper occlusion and shadows.

    Take a look at the shot of the close up of the car side - look under the front wheel and it just looks .... artificial.

    Unless there's some magic sauce that can be sprinkled on this without adding a frame rate hit, this isn't really all that wow at all.

    • Take a look at the shot of the close up of the car side - look under the front wheel and it just looks .... artificial.

      Unless there's some magic sauce that can be sprinkled on this

      Sure. It's called radiosity [google.com].

      without adding a frame rate hit

      ...oh

    • by grumbel ( 592662 )

      The core problem is that "raytracing" isn't a concrete thing or technology, it's a general-purpose technique. It can do pretty much anything from a few ugly shiny spheres to photorealistic rendering, and rendering time might vary from fractions of a second to days. Just like good old polygonal graphics can do everything from basic stuff like Star Fox to the latest Unreal Engine 3 tech demo or full photorealistic movies. Without any clear footage of actual games it's really pointless to discuss the issue, espe
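The "few ugly shiny spheres" end of that spectrum is small enough to sketch directly: one primary ray per pixel, a ray-sphere intersection test, and a Lambert shading term. Everything that moves ray tracing toward photorealism (shadows, reflections, global illumination, many rays per pixel) is extra cost on top of this. The tiny scene and all names below are made up for illustration.

```python
import math

# Minimal ray tracing sketch: primary ray vs. sphere intersection plus
# Lambertian shading. No shadows, no reflections, no bounces.

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def norm(a):
    length = math.sqrt(dot(a, a))
    return tuple(x / length for x in a)

def hit_sphere(origin, direction, center, radius):
    """Solve |origin + t*direction - center|^2 = radius^2 for the nearest t > 0."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c            # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, center=(0.0, 0.0, -3.0), radius=1.0, light=(2.0, 2.0, 0.0)):
    t = hit_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0                    # background
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = norm(sub(point, center))
    to_light = norm(sub(light, point))
    return max(0.0, dot(normal, to_light))   # Lambert term

# One primary ray per "pixel" of a tiny 4x4 image plane at z = -1.
for y in range(4):
    row = [shade((0, 0, 0), norm(((x - 1.5) / 2, (y - 1.5) / 2, -1.0))) for x in range(4)]
    print(["%.2f" % v for v in row])
```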

  • by airfoobar ( 1853132 ) on Wednesday March 09, 2011 @11:46AM (#35431584)
    Which is worse: that the article thinks "ray tracing" is a new Intel technology, or that it thinks "cloud" rendering is something that hasn't been around for 50 years?

  • I RTFA, and it's confusing. It started with talk of cloud computing on mobile devices (with no mention of how the constant speed bump of network lag was to be overcome) and then droned on about a new chip architecture.

    Nothing to see here, moving along...
  • This is garbage. Mobile gaming, cloud computing, eh, rewriting Wolfenstein to add ray tracing in the cloud??? I can see why that might make a POC, but Wolfenstein's not even 3D! "We have a red car sitting at a courtyard, which has a very shiny reflective surface. That can be rendered very good." OK, not speaking English isn't a crime. But the editors should catch this kind of thing. Not worth reading.
  • "A new technology from Intel called ray tracing "

    I stopped here.

  • Isn't this what Sony promised us the PlayStation 3 would do, and the supposed reason why they went with the "Cell" processor? Because every Sony device that had to do heavy graphics lifting would have one, and they would all cooperate to make your games better? And of course, this never came to pass, and Sony never really used Cells for anything else (ISTR there might have been a Cell-based Blu-ray player that wasn't the PS3, but maybe that was just a rumor.)

  • Ah, cloud computing... pauses to laugh... OK, earth to the idiots at Intel - your network latency kills any benefits that could ever be imagined for this system. An average video card nowadays can push 80-100 Gbps, higher-end cards are exceeding 150 Gbps and more. Let's look at video cable speeds - HDMI pushes 10.2 Gbps, VGA 10.8 Gbps, DVI 9.9 Gbps. Now let's look at the typical home internet connection today: it averages 1.5-3 Mbps. OK, let's do a thought experiment about how this stupid system would work.
    • Not sure what the problem is here, but you can stream 1080p video over a 10 Mbps internet connection. Basically, you are just constructing a video and playing the video back to the person. Properly compress the video with H.264 and there is no problem. Maybe you live in a part of the US where everybody has 1.5-3 Mbps connections, but where I live (Canada) 3 Mbps is actually the lowest anyone sells. And you can get 15 Mbps for pretty cheap.
      • Let's pretend you have a 15 Mbps connection (which I would say is a very atypical connection) and you get it down to 1 frame per second. It still doesn't matter and this is an incredibly slow and stupid system. You need network bandwidth equivalent to the video cable spec hooked from your video card to your monitor for this system to work. As far as I know, that WILL never happen while any of us are alive. Also, please link this 1080p system that works on a 10 Mbps connection? As I pointed out already,
        • by vadim_t ( 324782 )

          You're joking, right?

          All you need is a 10Gbps LAN. It's expensive, but it can be had today. It's most likely doable over plain gigabit with compression or a reduced framerate.

          Gigabit can already be had cheaply enough for home use. 10G will get there eventually, certainly in a lot less than my remaining lifetime.

            • We are talking about broadband and typical internet connections. I don't know where you get the idea that anyone can get a gigabit connection cheaply. Most US broadband runs at T1 to fractional-T3 speeds of 1.5-3 Mbps. The price is roughly $30-60 per month for that from the cable providers. An OC-24, on the other hand, which is a 1.244 Gbps connection, will only cost you ~$100,000 per month to have in the US. Seeing as how I don't have $100k per month to burn, I can't give you an exact figure
            • by vadim_t ( 324782 )

              I can only see this working in LAN settings though.

              The problem isn't with the video, it's the enormous CPU power required. Several very high end machines allocated to a single customer easily for hours, with all customers wanting to use it at about the same time, and at the price of a gaming service? I don't think it would work out at all.

              This would be more useful for some sort of corporate/scientific visualization purpose maybe. For home usage I imagine video cards will get there fairly soon, especially if

          • That's because the HDMI cable carries uncompressed video + audio data; you compress the data using H.264 to send it between the cloud and the client PC. If what you say is correct, then I wouldn't be able to play HD Netflix content on my home connection. Nor would I be able to play HD YouTube content. And I clearly can. Because they are sending over encoded video. They aren't sending raw frames. The TV in your living room is a dumb box and can only interpret raw video/audio data, which is why it requ
            • You are confused. HD Netflix is NOT 1080p (it is 720p, and sorry to break it to you, most of the titles are only available at NTSC resolution - not even 720p). Streaming will NOT work for gaming for a number of reasons (lack of sameness between blocks, can't use buffering, etc.). And decompressing images takes time too. Sorry, no free lunch, and this whole idea is just stupid. Much like cloud computing has been a stupid idea since IBM suggested it in the 1960s, and it is still a stupid idea today. If any of thi
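For reference, here is the back-of-the-envelope arithmetic behind this exchange about compressed versus uncompressed video. The 8 Mbps H.264 bitrate and the 15 Mbps home connection are illustrative assumptions, not figures from the thread or from the paper.

```python
# Uncompressed vs. compressed bitrate for 1080p60 video.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

raw_bps = width * height * fps * bits_per_pixel        # what an HDMI cable carries
print(f"Uncompressed 1080p60: {raw_bps / 1e9:.2f} Gbps")     # ~2.99 Gbps

h264_bps = 8e6                                          # assumed streamed-game bitrate
print(f"Compression ratio needed: {raw_bps / h264_bps:.0f}:1")

home_bps = 15e6                                         # assumed home connection
print(f"Fits a 15 Mbps line: {h264_bps <= home_bps}")
```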
      • H.264 doesn't work; you need a low-latency codec. Computing the motion compensation between N keyframes means you're introducing N frames of latency.

        So you need to transfer still images, encoded in MJPEG or something similar but more advanced. Is it possible?

        Of course [microsoft.com] it is [virtualgl.org]! One solution was introduced recently with Windows 7 SP1; the other is open source and has been available for some years.

        doing it from the cloud (i.e. fancy word for the internet) isn't so interesting, the technology sounds so much
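A rough sketch of the latency arithmetic behind that argument: an encoder that buffers frames for motion estimation or rate-control lookahead cannot emit anything until those frames have arrived, while an intra-only (MJPEG-like) encoder buffers none. The lookahead counts below are illustrative assumptions, not measurements of any particular codec.

```python
# Added encode latency from frame buffering at 60 fps.
fps = 60
frame_time_ms = 1000 / fps                # ~16.7 ms per frame at 60 Hz

def added_encode_latency_ms(lookahead_frames):
    """Latency added by waiting for `lookahead_frames` future frames."""
    return lookahead_frames * frame_time_ms

for name, lookahead in [("intra-only (MJPEG-like)", 0),
                        ("H.264, no B-frames", 0),
                        ("H.264, 3-frame lookahead", 3),
                        ("H.264, 16-frame lookahead", 16)]:
    print(f"{name:28s} +{added_encode_latency_ms(lookahead):5.1f} ms")
```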

    • by clgoh ( 106162 )

      You should take a look at onlive.com. It's exactly that kind of "cloud" gaming, and it exists right now.

      • Now go look for real life reviews of the service. I found easily dozens of user reviews like this one - "OnLive works if you're practically sitting at their server, but it's just not ready for mass market in any way shape or form." or "I doubt this service is really going to take off if it gets released to the public like this. No Wifi support and paying for a blurry video feed of a game isn't exactly "fun"." And these aren't even high end games they are hosting. It's all about the math. I'm shocked su
        • by clgoh ( 106162 )

          Just signed up for a free trial to test it... Yeah it's blurry, but on a shitty DSL connection, the latency was better than I expected.

          But it probably won't go anywhere.

  • When tablet/mobile data plans are no longer insanely expensive, and when broadband has no upload/download limits and decent speed, then you can start talking to me about rendering graphics in the "cloud".

  • How great would it be if /. automatically filtered stories that are about imagery but do not in fact have any images in them?

  • Is this an actual example of a good usage of the term "cloud"? In the sense of some computers out there somewhere doing stuff for you and you getting the results? Not long ago I heard about the company OnLive and their cloud-based gaming, where all the computing and rendering is done on their servers, you send your control inputs across the net to them and they send you back sound and video.

    Played it not long ago myself and expected the lag to be bad, but it turned out it wasn't bad after all. You can se

    • Look at the real user reviews of this service on the web. It's going down hard. There are a few positive reviews, but the majority are negative for a simple reason: IT'S A STUPID IDEA. Rendering an NTSC image of a game on a server by creating a virtual session and then sending it to a user over typical internet connections (btw, this thing needs a really high-end internet connection to even work) makes no sense. This thing will be dead and buried by next year. Mark my words.
      • NTSC? Those players are using thin pipes, huh?

        Sounds like you haven't tried it. Give it a go, there's a free trial. It's easy to run. I'm curious to hear how you like it.

        I've got a 20 Mbps connection and I'm probably close to one of their servers, so I haven't had network issues at all. You should only be running NTSC rates if you've got a 1.5 Mbps connection. That's well below average these days, isn't it?

        Network speeds keep improving (google "bandwidth over time"). This stuff will only keep getting

  • At 60 Hz, one screen refresh happens every ~16 ms, so the total latency comes to somewhere between 8 and 14 frames, with about 5 frames caused by the network RTT.

    Interesting.
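Converting those frame counts into milliseconds, assuming a 60 Hz display (my own arithmetic; the 8-14 frame total and the ~5 frame network share are the figures quoted in the comment above):

```python
# Latency budget in milliseconds for the quoted 8-14 frame range at 60 Hz.
refresh_ms = 1000 / 60                      # ~16.7 ms per frame

for total_frames in (8, 14):
    total_ms = total_frames * refresh_ms
    network_ms = 5 * refresh_ms             # ~5 frames attributed to network RTT
    other_ms = total_ms - network_ms        # rendering, encode/decode, display
    print(f"{total_frames} frames -> {total_ms:.0f} ms total "
          f"({network_ms:.0f} ms network RTT, {other_ms:.0f} ms everything else)")
```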
