Cloud Gaming With Ray Tracing
An anonymous reader writes "Since real-time ray tracing on a single desktop machine is still too slow for gaming, Intel has worked on a research project that puts the heavy workload up into the cloud, relying on multiple 32-core chips working together. That enables new special effects for games, like realistic reflections (e.g. on cars or the scope of a sniper rifle), glass simulations and multi-screen surveillance stations. The research paper also takes a closer look at various sources of latencies that become relevant in a cloud-based gaming approach."
Bad Tech Journalism (Score:5, Informative)
First Line of the Article:
"A new technology from Intel called ray tracing could bring lifelike images and improved 3D effects to games on tablets and other mobile devices."
GAH!
Re: (Score:2)
Re: (Score:1)
No. Doom was ray casting, a much simpler technique.
Re: (Score:2)
That's why he said '2.5D ray tracing' (another way to say 'ray casting').
Re: (Score:2)
Eh, I don't know about your (if any) experience with computer graphics, but as far as I know, and I did my fair share of CG programming, raycasting has nothing to do with the mentioned '2.5D'. DOOM was '2.5D' because of the peculiarities and limitations of its supposed 3D engine: level maps were stored in memory in such a way (BSP) as to make it impossible to put two floors on top of one another (if I am not mistaken), and it also couldn't handle sloped surfaces. That's why it is called 2.5D. Basically ...
Re: (Score:2)
It's not the direction that counts, actually, it's the steps involved - raycasting does not compute any rays between scene objects themselves, only rays that cross a given projection plane on their way to a light source or a scene object. Raytracing adds interplay between the scene objects on top of this, in the form of rays that are emitted in pretty much all directions from each point on any scene object, ideally even INSIDE it (light penetrates human skin, for example). This makes raycasting immensely faster than raytracing (which is ...
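For contrast, a 2.5D raycaster in the Wolfenstein/Doom mold marches exactly one ray per screen column through a flat grid map and turns the hit distance into a wall height. A minimal sketch (made-up map, player position and constants; crude fixed-step marching instead of the DDA a real engine would use):

    // Minimal 2.5D raycaster sketch: one ray per screen column is marched
    // through a flat 2D grid map; the hit distance sets the wall height.
    // Map, player position and constants are made up for illustration.
    #include <cmath>
    #include <cstdio>

    const int SCREEN_W = 80, SCREEN_H = 24;
    const char* MAP[8] = {      // '#' = wall, '.' = empty floor
        "########",
        "#......#",
        "#..#...#",
        "#..#...#",
        "#......#",
        "#.#..#.#",
        "#......#",
        "########",
    };

    int main() {
        double px = 2.5, py = 2.5;   // player position inside the map
        double dir = 0.7, fov = 1.0; // view direction and field of view (radians)

        for (int col = 0; col < SCREEN_W; ++col) {
            // Exactly one ray per screen column -- nothing is traced between
            // scene objects, which is what keeps this so much cheaper.
            double angle = dir - fov / 2 + fov * col / SCREEN_W;
            double dx = std::cos(angle), dy = std::sin(angle);
            double dist = 0.0;
            while (dist < 20.0) {    // fixed-step march; stops at the first wall
                dist += 0.02;
                if (MAP[int(py + dy * dist)][int(px + dx * dist)] == '#') break;
            }
            // Perpendicular distance avoids the fish-eye; wall height ~ 1/distance.
            double perp = dist * std::cos(angle - dir);
            std::printf("column %2d: wall height %d rows\n", col, int(SCREEN_H / perp));
        }
    }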
Re: (Score:2)
you're thinking about global illumination.
illuminance on a surface is computed simply from the surface normal (or per-pixel interpolated normal) and its angle relative to all applicable light sources. no raytracing there.
backward ray tracing is shooting off a ray for every pixel in the camera's projection plane and checking for intersections (and optionally interactions with objects).
forward ray tracing is shooting off a bunch of rays from each and every light source and bouncing them around the scene. this ...
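To make the "backward" case concrete, here is a toy tracer along those lines; the two-sphere scene, the single light direction and the one-bounce mirror reflection are made up for illustration, not anything from Intel's paper:

    // Toy backward ray tracer: one primary ray per pixel, nearest-hit test
    // against a couple of spheres, Lambert shading from one light direction,
    // plus a single mirror bounce. Scene and constants are invented.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    struct Vec { double x, y, z; };
    Vec operator+(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    Vec operator*(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
    double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    Vec norm(Vec a) { return a * (1.0 / std::sqrt(dot(a, a))); }

    struct Sphere { Vec c; double r; double reflect; };
    const Sphere SCENE[] = { {{0, 0, -5}, 1.0, 0.5}, {{2, 0, -6}, 1.0, 0.0} };

    // Hit distance along the ray, or negative on miss.
    double hit(const Sphere& s, Vec o, Vec d) {
        Vec oc = o - s.c;
        double b = dot(oc, d), c = dot(oc, oc) - s.r * s.r;
        double disc = b * b - c;
        if (disc < 0) return -1;
        double t = -b - std::sqrt(disc);
        return t > 1e-4 ? t : -1;
    }

    double shade(Vec o, Vec d, int depth) {
        const Vec LIGHT = norm({1, 1, 0.5});
        double best = 1e30; const Sphere* hitS = nullptr;
        for (const Sphere& s : SCENE) {             // nearest intersection wins
            double t = hit(s, o, d);
            if (t > 0 && t < best) { best = t; hitS = &s; }
        }
        if (!hitS) return 0.1;                      // background
        Vec p = o + d * best;
        Vec n = norm(p - hitS->c);
        double diffuse = std::max(0.0, dot(n, LIGHT));   // Lambert term
        double col = 0.2 + 0.8 * diffuse;
        if (depth > 0 && hitS->reflect > 0) {       // single mirror bounce
            Vec r = d - n * (2 * dot(d, n));
            col = col * (1 - hitS->reflect) + shade(p, norm(r), depth - 1) * hitS->reflect;
        }
        return col;
    }

    int main() {
        const int W = 40, H = 20;
        for (int y = 0; y < H; ++y) {
            for (int x = 0; x < W; ++x) {
                // One ray from the eye through each pixel of the image plane.
                Vec d = norm({(x - W / 2) / double(W), (H / 2 - y) / double(W), -1.0});
                std::putchar(" .:-=+*#%@"[int(shade({0, 0, 0}, d, 1) * 9.99) % 10]);
            }
            std::putchar('\n');
        }
    }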
Re: (Score:2)
Yes, you're absolutely right. But I was simply talking about how raycasting stops where raytracing truly begins (forward or backward alike.)
Re: (Score:2)
Well, at least I corrected the OP's notion that '2.5D raytracing is raycasting', which is completely without merit or sense. I was explaining why the Doom/Wolfenstein engines are called 2.5D - yes, it's because, among other things, Wolfenstein used the same wall height everywhere and Doom went a step further and carried the height in the map, which still made it impossible to have two floors on top of one another. I don't see why you paint me as clueless here. Yes, when I was experimenting with raytracing algorithms back in ...
Re: (Score:2)
Re: (Score:2)
Even better:
The research paper also takes a closer look at various sources of latencies that become relevant in a cloud-based gaming approach.
How about those fucking usage caps???
Re: (Score:2)
How about those fucking usage caps???
They don't seem to have put Netflix out of business.
However, it's true that demand for higher bandwidth applications will drive a market for higher caps, uncapped contracts, and faster pipes.
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
To the cloud!
*rips out eyes/ears and smashes his head through some Windows*
Re: (Score:2)
Intel has been trotting this story out every three months or so for as long as I can remember.
As memes go, "Intel shows fully raytraced game" is right up there with "Duke Nukem is nearly finished!" and "This year will be the year of the Linux desktop".
Re: (Score:1)
Re: (Score:2)
I wouldn't want my games up in Intel's cloud somewhere where I don't have any control and where I have to rely on my ISP to provide good latency. But it might be interesting to me to have a single powerful home server and then a couple of laptops and a couple of tablets that are basically IO devices and little else. Granted, you couldn't have everyone doing demanding, graphically intense games at all times, but a reasonably powerful desktop server should be more than capable of rendering 2x1080p laptop screens ...
Re: (Score:2)
Intel wants to find a way to remain relevant. If they can come up with some clustering secret sauce, then there is a reason to continue to buy products based on their technology. The PC gaming market is in decline, the only space in computing that is growing rapidly is mobile computing, and corporations work on the model of boundless expansion. They need to conquer new markets to continue to exist. If Intel can get you to buy an Intel-based phone, desktop, and tablet because they will cluster, then it's a good ...
Re: (Score:2)
To me it sounds more like the Intel Larrabee division has moved to the 'cloud'. Apart from that it's just a repeat from 2006, 2007, 2008, 2009...etc.
Intel? (Score:1)
I was unaware Whitted worked for Intel. </sarcasm>
Re: (Score:2)
I was unaware Whitted worked for Intel. </sarcasm>
Actually, someone else may have beaten him to it .. but now I can't find the paper that cites the earlier reference.. so this post is a bit pointless :-(
..typical... (Score:2)
I was unaware Whitted worked for Intel. </sarcasm>
Actually, someone else may have beaten him to it .. but now I can't find the paper that cites the earlier reference.. so this post is a bit pointless :-(
Oh that's typical... I just found the paper. It's "Interactive Rendering with Coherent Ray Tracing" by Ingo Wald, Philipp Slusallek, Carsten Benthin, and Markus Wagner, published at Eurographics 2001.
Re: (Score:1)
Turner Whitted, "An Improved Illumination Model for Shaded Display," Communications of the ACM, v.23 n.6, p.343-349, June 1980 [acm.org]
Re: (Score:1)
Sigh... (Score:2)
If you read the paper I mentioned, you will see it, in turn, cites "A. Appel. Some techniques for shading machine renderings of solids. SJCC, pages 27–45, 1968."
Ray Tracing? Is he a good opponent? (Score:2)
Can't say I've ever heard of him, though. I used to play against someone called Polly, but she's gone now.
New Technology? (Score:2)
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
http://www.youtube.com/watch?v=tCMo-bJQC8A [youtube.com]
Heaven 7 by Exceed.
64kb executable. Raytraced. Over 10 years old.
Oh, and while we're doing cool demos that use raytracing: http://www.youtube.com/watch?v=EK7jkVAvA_Y [youtube.com] (pouet link: http://www.pouet.net/prod.php?which=49856 [pouet.net])
Re: (Score:2)
Raytracing in 64kB 10 years ago is definitely cool, but they were raytracing in the early 80s.
So Intel, we finally get to see Larrabee eh? (Score:1)
It's not that impressive, either.
On the topic of raytracing - one thing that still stands out to me from the images in the paper is the lack of proper occlusion and shadows.
Take a look at the shot of the close up of the car side - look under the front wheel and it just looks .... artificial.
Unless there's some magic sauce that can be sprinkled on this without adding a frame rate hit, this isn't really all that wow at all.
Re: (Score:2)
Take a look at the shot of the close up of the car side - look under the front wheel and it just looks .... artificial.
Unless there's some magic sauce that can be sprinkled on this
Sure. It's called radiosity [google.com].
without adding a frame rate hit
...oh
Re: (Score:2)
The core problem is that "raytracing" isn't a concrete thing or technology, it's a general-purpose technique. It can do pretty much anything from a few ugly shiny spheres to photorealistic rendering, and rendering time might vary from fractions of a second to days. Just like good old polygonal graphics can do everything from basic stuff like Star Fox to the latest Unreal Engine 3 tech demo or full photorealistic movies. Without any clear footage of actual games it's really pointless to discuss the issue, especially ...
What's more horrific (Score:3)
no meat about cloud computing? (Score:2)
I RTFA'd, and it's confusing. It started with talk of cloud computing on mobile devices (with no mention of how the constant speed bump of network lag is to be overcome) and then droned on about a new chip architecture.
Nothing to see here, moving along...
stupid article (Score:1)
Fail (Score:1)
"A new technology from Intel called ray tracing "
I stopped here.
Re: (Score:2)
Then every year it's another ray tracing demo, usually jammed awkwardly into an id Tech engine.
What other engine do you recommend? What other major label releases engines of its five-year-old games under the GNU GPL?
I can create photoreal images on my netbook with a GPU that has 24 stream processors.
At what resolution and frame rate?
Re: (Score:2)
What other engine do you recommend? What other major label releases engines of its five-year-old games under the GNU GPL?
How about something new that they have written themselves? What's the point of demonstrating the supposedly next big thing in computer games with some obsolete five-year-old game? How are they ever going to get games written for raytracing if they can't even find somebody to put together a solid tech demo?
Programmer art (Score:2)
How about something new that they have written themselves?
Intel is in the hardware business and possibly the driver business. Making parts of a video game other than code needs a different skill set; otherwise, you will likely end up with the phenomenon called "programmer art". It's far cheaper to start with a 5-year-old Id game than to hire a producer and competent artists to come up with an original setting.
This sounds familiar (Score:2)
Isn't this what Sony promised us the Playstation 3 would do, and the supposed reason why they went with the "Cell" processor? Because everything Sony that had to do heavy graphics lifting would have one, and they would all cooperate to make your games better? And of course, this never came to pass, and Sony never really used Cells for anything else (ISTR there might have been a Cell-based Blu-Ray player that wasn't the PS3, but maybe that was just a rumor.)
Another silly cloud computing idea (Score:1)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
You're joking, right?
All you need is a 10Gbps LAN. It's expensive, but it can be had today. It's most likely doable over plain gigabit with compression or a reduced framerate.
Gigabit can already be had cheaply enough for home usage. 10G will get there eventually, certainly in a lot less than my remaining lifetime.
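Back-of-the-envelope numbers for the LAN case, assuming uncompressed 24-bit frames (real setups would compress):

    // Rough bandwidth for streaming rendered frames over a LAN.
    // Assumes uncompressed 24-bit RGB at 1080p60; purely illustrative.
    #include <cstdio>

    int main() {
        const double w = 1920, h = 1080, bytes_per_px = 3, fps = 60;
        double bits_per_sec = w * h * bytes_per_px * 8 * fps;   // ~2.99 Gbit/s
        std::printf("Uncompressed 1080p60: %.2f Gbit/s\n", bits_per_sec / 1e9);
        std::printf("Two such screens:     %.2f Gbit/s\n", 2 * bits_per_sec / 1e9);
        // ~3 Gbit/s per screen: fits a 10G link with room to spare, but needs
        // roughly 3-6x compression (or a lower frame rate) to fit gigabit.
    }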
Re: (Score:1)
Re: (Score:2)
I can only see this working in LAN settings though.
The problem isn't with the video, it's the enormous CPU power required. Several very high end machines allocated to a single customer easily for hours, with all customers wanting to use it at about the same time, and at the price of a gaming service? I don't think it would work out at all.
This would be more useful for some sort of corporate/scientific visualization purpose, maybe. For home usage I imagine video cards will get there fairly soon, especially if ...
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Commercial solutions already available! (Score:1)
h264 doesn't work as-is; you need a low-latency codec. Motion compensation that references frames N ahead means buffering those frames before encoding, which introduces N frames of latency.
So you need to transfer still images, encoded in MJPEG or something similar but more advanced. Is it possible?
Of course [microsoft.com] it is [virtualgl.org]! One solution was introduced recently with Windows SP1; the other is open source and has been available for some years.
Doing it from the cloud (i.e. a fancy word for the internet) isn't so interesting; the technology sounds so much ...
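For a rough sense of what intra-only (MJPEG-style) streaming costs, here is a quick estimate; the resolution and the 0.5 bit/pixel figure are assumptions, not measurements:

    // Rough per-frame size and bitrate for intra-only (MJPEG-style) streaming.
    // The bits-per-pixel figure is an assumed JPEG quality, not a benchmark.
    #include <cstdio>

    int main() {
        const double w = 1280, h = 720, fps = 60;
        const double bits_per_px = 0.5;                    // assumed quality
        double frame_bits = w * h * bits_per_px;           // ~460 kbit per frame
        std::printf("Per frame: %.0f kbit\n", frame_bits / 1e3);
        std::printf("Stream:    %.1f Mbit/s\n", frame_bits * fps / 1e6);
        // Intra-only coding keeps the codec delay to one frame of encode plus
        // decode, at the cost of several times the bitrate of inter-coded H.264.
    }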
Re: (Score:1)
You should take a look at onlive.com. It's exactly that kind of "cloud" gaming, and it exists right now.
Re: (Score:1)
Re: (Score:1)
Just signed up for a free trial to test it... Yeah it's blurry, but on a shitty DSL connection, the latency was better than I expected.
But it probably won't go anywhere.
uh huh (Score:2)
When tablet/mobile data plans aren't insanely expensive and broadband has no upload/download limits and decent speed, then you can start talking to me about rendering graphics in the "cloud".
Re: (Score:1)
I think the notion that games have to be any one thing is preposterous.
article about visual stuff w/ no images. (Score:2)
how great would it be if /. automatically filtered stories which are about imagery but do not in fact have images in them.
"cloud" makes sense? (Score:2)
Is this an actual example of a good usage of the term "cloud"? In the sense of some computers out there somewhere doing stuff for you and you getting the results? Not long ago I heard about the company OnLive and their cloud-based gaming, where all the computing and rendering is done on their servers, you send your control inputs across the net to them and they send you back sound and video.
Played it not long ago myself and expected the lag to be bad, but it turned out it wasn't bad after all. You can see ...
Re: (Score:1)
Re: (Score:2)
NTSC? Those players are using thin pipes, huh?
Sounds like you haven't tried it. Give it a go, there's a free trial. It's easy to run. I'm curious to hear how you like it.
I've got a 20 Mbps connection and I'm probably close to one of their servers, so I haven't had network issues at all. You should only be running NTSC rates if you've got a 1.5 Mbps connection. That's well below average these days, isn't it?
Network speeds keep improving (google "bandwidth over time"). This stuff will only keep getting better ...
In number of images (Score:2)
At 60 Hz, one screen refresh comes every ~16 ms, so the pipeline works out to roughly 8 to 14 frames of latency, with about 5 of those frames caused by the network RTT.
Interesting.
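Multiplying those frame counts out into milliseconds at a 60 Hz refresh (just arithmetic on the numbers above):

    // Convert the frame counts above into milliseconds at a 60 Hz refresh.
    #include <cstdio>

    int main() {
        const double frame_ms = 1000.0 / 60.0;                     // ~16.7 ms per refresh
        std::printf("8 frames        = %.0f ms\n", 8 * frame_ms);  // ~133 ms
        std::printf("14 frames       = %.0f ms\n", 14 * frame_ms); // ~233 ms
        std::printf("5 frames of RTT = %.0f ms\n", 5 * frame_ms);  // ~83 ms
    }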