NVIDIA Unveils GRID Servers, Tegra 4 SoC and Project SHIELD Mobile Gaming Device
MojoKid writes "NVIDIA made some bold moves at their CES 2013 press conference and announced a couple of potentially game-changing products. GeForce GRID is a cloud gaming solution. It allows PC game content to be run and rendered in the cloud and then streamed to any device that can run the GRID receiver utility, like a smart TV, tablet, or smartphone. The GeForce GRID server architecture combines an NVIDIA-designed server packed with GPUs with NVIDIA-developed software and a virtualization layer. A rack of 20 GRID servers was shown, powered by 240 GPUs, capable of 200 TFLOPS and roughly equivalent in performance to 720 Xbox 360 consoles. The biggest news to come out of NVIDIA's press conference, however, had to do with Tegra 4. Not only was the next-gen SoC officially unveiled, but a new portable gaming device based on Tegra 4, dubbed Project SHIELD, was also demoed. NVIDIA's Tegra 4 builds upon the success of the Tegra 3 by incorporating updated ARM Cortex-A15-based CPU cores with 72 custom GeForce GPU cores, which offer up to 6x the performance of Tegra 3. The A15 cores used in Tegra 4 are up to 2.6x faster than the A9-class cores used in Tegra 3. As a companion to the Tegra 4, NVIDIA also took the wraps off their new Icera i500 programmable 4G LTE modem processor. The Icera i500 features 8 custom, programmable processor cores and is approximately 40% smaller than many fixed-function modems. The biggest surprise to come out of NVIDIA's press conference was Project SHIELD, a Tegra 4-powered mobile gaming device running Android that's sure to put Microsoft, Sony, and Nintendo on high alert. Project SHIELD offers a pure Android experience without any skinning or other customizations, save for the SHIELD app environment, and can play any Android game. Project SHIELD can also stream PC games from a GeForce GTX-equipped PC. The device is shaped much like an Xbox 360 game controller, but features a 5", flip-out capacitive touch display with 720p resolution.
The device can also stream to an HD TV via HDMI or a WiDi-like wireless dongle. In fact, CEO Jen-Hsun Huang showed Project SHIELD playing a 4K video on an LG 4K TV."
Game controller with Display (Score:3, Interesting)
Would be an excellent part for the upcoming console generation.
I miss the old Dreamcast controller with its LCD display in it.
A modern take on that would have a nice 5" touchscreen LCD with built-in GPU.
It would be expensive as hell of course...
Re: (Score:1)
WiiU renders on the console, not on the controller.
Re: (Score:1)
And when it renders on the console, it becomes GPU-bound, limiting the number of display-equipped controllers it can support (the WiiU only allows one controller with a display).
The old Dreamcast was great at multiplayer because each controller would have its own display, letting each player pick plays and so on secretly from the others...
Re: (Score:1)
Your virginity is astounding.
Re: (Score:2)
Correction. WiiU can do 2 controllers, but there's no game that has any use for that, nor will there be for quite some time.
So no football titles are planned for it? Bummer. As GP mentioned, DC was great for those games because of the personal screen on it.
Not the first android game device, (Score:2)
And there is the Ouya which was mentioned here on slashdot recently.
I can't help but wonder if the android hardware game device market is about to get really crowded.
Re: (Score:2)
This is a game changer that we have been dreaming (Score:3)
This is the system of our wet dreams. For years we have talked about cloud gaming devices. And in theory internet speeds are fast enough to make this work.
This is actually very interesting. How will Sony, Microsoft, and the consoles compete with this? Could this thing be used to bring back arcade gaming? I could see arcades coming back with something like this.
Re: (Score:1)
And in theory internet speeds are fast enough to make this work.
Internet speeds aren't fast enough to make this work, and not just bandwidth, but also latency.
Local ping times are far too high to be usable. Local Verizon FIOS has 25ms pings across the city, never mind across the country.
You need sub-3ms ping times.
Re: (Score:2)
How can you have such a completely different experience from my own? I tried Gaikai at work during lunch breaks. Our offices are on top of a datacentre and we have a few 10Gbps direct links to LINX (london internet exchange). I don't think gaming connectivity gets any better than what I have there. Yet Gaikai SUCKED donkey balls every single time I tried it. I absolutely hated the latency and never played for more than 15 minutes before I got annoyed with it.
Re: (Score:2)
Yep, and remember that the (imaginary) market for this system would be people with tablets connected over 4G or home WiFi.
Nobody with a desktop PC would be simultaneously:
a) Rich enough to subsidize the hardware investment needed by the service providers to make this work (monthly fee...$50?)
and
b) Too poor to buy a $150 graphics card (which is what this hardware will equate to in six months time).
Re: (Score:2)
Local ping times are far too high to be usable.
I remember seeing marketing materials from that cloud gaming company, the large one whose name I forget.
They had some lovely graphs about latency. They'd managed to get the rendering and compression latency way down (impressive, but probably possible).
They'd also apparently done the same to the network latency. Given that all this supposedly happened in the data centre, it smelled like a herd of bulls had recently wandered through.
Re: (Score:3)
Local ping times are far too high to be usable.
I remember seeing marketing materials from that cloud gaming company, the large one whose name I forget.
They had some lovely graphs about latency. They'd managed to get the rendering and compression latency way down (impressive, but probably possible).
They'd also apparently done the same to the network latency. Given that all this supposedly happened in the data centre, it smelled like a herd of bulls had recently wandered through.
Onlive. Their offerings were so great that they went bankrupt.
Re: (Score:2)
Onlive. Their offerings were so great that they went bankrupt.
Ah yes. Thanks. That's exactly the bunch of scam artists I was referring to.
I say scam because they basically made shit up about the biggest problem, which was network latency. Oh, and they usually quoted the one-way latency rather than the round-trip ping time, which is what actually matters in gaming.
Re: (Score:1)
I'm not sure what Verizon is doing wrong, but certainly something. I'm using cable in Finland (DNA Welho). Ping to the regional exchange point (FICIX) is roughly 7ms, ping to Rovaniemi (~1000km) on a different operator is roughly 25ms, and ping to the first hop in Germany (Level3 Dusseldorf) is 44ms.
as long as it's wifi only streaming 3g / 4g data c (Score:2)
As long as it's WiFi only. Streaming over 3G/4G, data costs will run up fast, and don't even think of roaming, as 1-2 hours of that can cost more than buying a high-end gaming laptop.
Re: (Score:2)
3G/4G latency will make the device unplayable. No need to worry about running up the bill; you would be far too frustrated to play for more than a couple of minutes.
Re: (Score:2)
WTF is the point of streaming the entire game rather than having the cloud just take care of the back end?
NVIDIA is going down the toilet, with Apple and Qualcomm killing their business, so they had to make up this crazy idea to stay relevant.
Re:This is a game changer that we have been dreami (Score:5, Insightful)
This is the system of our wet dreams. For years we have talked about cloud gaming devices. And in theory internet speeds are fast enough to make this work.
Who is this "we"? I dream of having a handheld device that runs games. I want it to work where wireless coverage is spotty or nonexistent; without that, a portable device is worse than useless: it's a rock I have to drag around because it's expensive.
Re: (Score:2)
Who is this "we"?
I think he means NVIDIA - NVIDIA have been hoping for ages somebody will be stupid enough to actually try this.
Same thing with Intel when they were trying to find a buyer for their Larrabee chips - real-time raytracing for cloud games! Only a billion dollars of Intel chips and motherboards needed!!
Re: (Score:2)
Unless the server is very close to you, the latency between button press and action is too high. I have tried onlive and similar they all suffer from this. For some game styles it is fine, but anything with fast action makes this an exercise in frustration. The speed of light is a real bitch.
Re: (Score:1)
Re: (Score:2)
I know this can easily be expanded into a fallacy, but just think about what they could have accomplished by spending all those hours practicing a real martial art. /off to conquer the world
Re: (Score:2)
It is connected to a GeForce 650 or better via HDMI. So that aspect is pretty gimmicky.
And IMHO for your portable gaming needs you'd be better served by a full tablet + a PS2 controller.
This is a tech demo and I do not expect it to go into production anytime soon. nVidia are trying to get more OEMs to buy their Tegra4 SoC. And honestly an Asus Transformer with this baby makes me actually pretty happy in the pants.
Tegra4's party piece is that it can display different stuff via HDMI
Re: (Score:2)
These days home consoles are pretty close to general-purpose computing boxes, and could quite easily be used as clients for cloud-gaming services just as they're currently used as clients for cloud-video services. Sony recently bought Gaikai after all, they've got to be doing something with it. (Rumour has it, all their back compatibility...)
The 1980s called (Score:5, Insightful)
They want their X Windows back.
Re: (Score:3)
HEY!
First you wanted to be able to run remote desktop.
Then the user complained of how slow it was and only 1% (of something) used that feature.
Now, just as we're working to get rid of it, you want that feature back before we've gotten rid of it!?
Make up your mind!
Re: (Score:2)
First you wanted to be able to run remote desktop.
Then the user complained of how slow it was and only 1% (of something) used that feature.
Actually, over a decent LAN (Intranet, and now even with broadband around the world) running remote apps over X has worked reasonably well.
Back when I was at Boeing, I used to manage machines all over the Puget Sound region this way (20 to 30 miles between sites). And we had Windows NT with an X client front end, so people with *NIX systems could open a Windows desktop on the rare occasion that some manager e-mailed out a Word Doc with macros* and you absolutely had to run Word to see it.
It was Microsoft'
Re: (Score:2)
Yeah, I remember playing 1280x720 3D games across X-Windows at 60FPS in the 80s.
Nobody did that.
But, in the 90s, I remember extensions for X that allowed streaming compressed video from client to server and utilizing the server display hardware to do the rendering. The same could be done from a graphics intensive game on the client to display hardware with a powerful GPU.
Nothing terribly novel to see here.
Wow (Score:2)
That is one long paragraph.
Re: (Score:2)
Re: (Score:2)
NVIDIA is hoping somebody else will pay for that. Right after they buy $100 million of NVIDIA chips and find out the "Latency isn't a problem!" thing was just a rigged demo.
Re: (Score:2)
Nothing.
Forget updating everyone else's infrastructure; even if it were all multimode fiber, this is a nonstarter due to the speed of light. Unless the server is very close (under 280 miles is about the absolute maximum), you can forget about this. That would give you 3ms of latency on each button press, assuming there is no latency other than that induced by the speed of light.
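The physics argument above is easy to sketch. The numbers below are a back-of-the-envelope check, not measured values: round-trip propagation delay over 280 miles at the vacuum speed of light works out to the ~3ms the comment cites, and in real fiber (signals propagate at roughly 2/3 c) it is worse.

```python
# Round-trip propagation delay over a given distance, ignoring all other
# latency sources (routing, queuing, encoding, display). Illustrative only.

C_VACUUM_KM_S = 299_792   # speed of light in vacuum, km/s
C_FIBER_KM_S = 200_000    # ~2/3 c, typical for optical fiber

def round_trip_ms(distance_km: float, speed_km_s: float) -> float:
    """Round-trip time in milliseconds for one request/response."""
    return 2 * distance_km / speed_km_s * 1000

miles_280_km = 280 * 1.609  # ~450 km
print(round(round_trip_ms(miles_280_km, C_VACUUM_KM_S), 2))  # ~3.01 ms in vacuum
print(round(round_trip_ms(miles_280_km, C_FIBER_KM_S), 2))   # ~4.51 ms in fiber
```

In other words, even the ideal-physics figure already consumes the whole sub-3ms budget the earlier poster demanded, before any real-world overhead is added.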
Re: (Score:2)
What makes GRID any better than OnLive? [wikipedia.org] Specifically in regards to latency, is the lag reduced between controller input and display? Unless nVidia is prepared to upgrade everyone else's infrastructure, I don't see this taking off.
It doesn't do a thing to solve the (significant) 'even customers in the same city have shitty ping, and we can't usefully load-balance our datacenters because adding a cross-country fiber trip totally ruins things, so we have to provision for peak getting-home-from-school/work-and-playing-games time; but let most of it sit idle during the day' problems that helped doom Onlive.
It probably is much better placed than Onlive was to fix the "We basically need an entire computer, or a VM with dedicated hardware p
also need data centers close to major cities (Score:2)
Also need data centers close to major cities to keep ping times down, so you need more than one to cover the mainland USA.
AK and HI may need their own as well.
Re: (Score:2)
May?
They absolutely would have to. A state the size of AK would need several. HI is also pretty spread out, but maybe it could be just a few since the islands are in a line. Assuming no latency other than that incurred by the speed of light, and for a spherical cow, 280 miles is the maximum for a 3ms latency. That is being very generous and it still looks like a pretty much impossible task, unless you only allow some zip codes to sign up for service to avoid having pissed-off customers.
Re: (Score:2)
"May" as in priced higher or not available in AK or HI.
Re: (Score:2)
Re: (Score:2)
EVERYTHING is turn-based. It's called server tick rate. Just because it's fast enough for you not to feel it doesn't mean it's not there.
Re: (Score:2)
Pedant.
SHIELD's screen directly over the buttons? (Score:3)
Hmm, from my experience with some laptops, it's usually not a good idea to have a screen directly over keys (or buttons). I've seen a couple of screens with "marks" from the keys, although this might be due to the quality or age of the laptops in question... Perhaps the SHIELD has the buttons below the level of the screen when closed? :)
I'm nitpicking, I know, but that was the first thing that crossed my mind while looking at the pictures
more stupidity (Score:4, Insightful)
Re: (Score:2)
It will probably stream OpenGL commands, not rendered images.
Doubtful.
If you can stream one rendered image every 16ms, you can display 60fps. If you send OpenGL commands, you will be better off on a lot of frames, but it will stutter every time you need to send a texture larger than a rendered frame (and most textures are larger than a rendered frame).
The game is to make the peak frame latency 16ms (or whatever your target is), not to reduce the average.
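The comment's texture-stutter point can be put in numbers. This sketch uses assumed figures (a 10 Mbit/s stream and a 2048x2048 RGBA texture are illustrative choices, not values from the article): an encoded frame fits easily inside the ~16.7ms budget, but shipping one large uncompressed texture at the same link rate takes hundreds of frame budgets.

```python
# Per-frame byte budget of a compressed stream vs. one large texture upload.
# All figures are illustrative assumptions.

STREAM_MBPS = 10                 # assumed stream bitrate
FPS = 60

frame_budget_ms = 1000 / FPS                          # ~16.7 ms per frame
bytes_per_frame = STREAM_MBPS * 1_000_000 / 8 / FPS   # ~20.8 KB per encoded frame

texture_bytes = 2048 * 2048 * 4                       # 16 MiB of uncompressed RGBA
# Time to push that texture at the same link rate:
texture_ms = texture_bytes * 8 / (STREAM_MBPS * 1_000_000) * 1000

print(round(frame_budget_ms, 1))   # 16.7
print(round(texture_ms))           # 13422 -- roughly 800 frame budgets
```

This is why peak frame latency, not average, is the number that matters: one texture transfer like this produces a very visible hitch.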
Re: (Score:1)
Wouldn't it make sense then to have some data (textures etc) locally cached, and just send the scene/model info?
Re: (Score:2)
Re: (Score:1)
The post a few up suggested they might be streaming OpenGL commands. My suggestion was that it could possibly be done if the textures or similar large data-blobs had a local cache.
Re: (Score:2)
Read further into it, and you effectively get three options in working with it.
1. Run your android things on it, entirely locally, with a quality controller
2. Use your LOCAL PC to stream stuff to it that then feeds the tv.
3. Use the cloud.
Personally, I'm more interested in number one than anything else, and I'm disappointed that not a single thread on here has gone "this is a neat hand-held console".
Yes, you can stream cloud stuff on it, big deal; it has better uses, let's look at those.
Re: (Score:3)
Re: (Score:2)
Re: (Score:3)
It will probably stream OpenGL commands, not rendered images.
That could very easily be counterproductive to the task at hand; the data needed to create the image could easily be much bigger that way than just streaming the images in modern games, and you might just as well play the entire game on the handheld then.
They're probably encoding on the GPU, though, thus the minimum GPU requirement on the PC.
anyways, it's not that onlive doesn't work. it works if you have a fast connection.. of course locally it would be better still. but onlive type of system is pre
Re: (Score:2)
I could have sworn there was another company that recently discovered that you cannot in fact stream 1920x1080 uncompressed over a standard internet connection and went bankrupt. Now Nvidia is trying to do it?
No, NVIDIA's just fishing for somebody who's stupid enough to go bankrupt from buying NVIDIA chips.
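The arithmetic behind the "you cannot stream 1080p uncompressed" point is easy to check. This sketch assumes 24-bit RGB at 60 fps:

```python
# Bandwidth needed for uncompressed 1080p60 video, as a sanity check on
# why heavy compression is mandatory for any streamed-gaming service.

width, height, fps = 1920, 1080, 60
bytes_per_pixel = 3   # 24-bit RGB, no alpha

gbps = width * height * bytes_per_pixel * fps * 8 / 1e9
print(round(gbps, 2))  # 2.99 Gbit/s
```

Roughly 3 Gbit/s, or two orders of magnitude beyond a typical 2013 home connection, which is why every such service ships an H.264 stream instead.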
Re: (Score:2)
Nvidia has solved the old problem of not wanting to sit at your PC to play games on a 27" screen but on a 10" screen while sitting on the couch. Wait what?
The HDMI streaming thing is indeed bit
Re: (Score:2)
An Android portable gaming device actually makes a lot of sense. As much as a Game Boy, PSP, and so on.
More, in fact, for a variety of reasons. But it has to be small and cheap and nVidia will want a nice big screen. A small tablet (MID-sized) that can dock to a controller would make almost infinitely more sense, ala the Transformer series. I, however, think that Asus should do it, and nVidia should stick to making chips.
Re: (Score:2)
Tegra4 can show a different screen on the tablet than it does via HDMI. My Prime already is a nice business/gaming/allrounder and I'm just waiting for a proper justification to ditch it for a new shiny. Which hopefully doesn't crawl to a shameful halt while it does any kind of meaningf
depends on the game (Score:4, Insightful)
Maybe this isn't a solution for FPS games, but I would love to be able to play Civilization V from the cloud with all the graphic bells and whistles.
Re: (Score:2)
nVidia has a lot of money in hardware video compression. Throwing their chips in with cloud gaming is a low-cost bet for a company that's going to be developing the technology for other applications.
Uses Kepler's h.264 encoder (Score:2)
The streaming video is of course compressed, by the required GeForce 650+ GPU which has dedicated low-latency h.264 encoding hardware. The Shield unit supports 2x2 MIMO 802.11n, which should be more than capable.
nVidia haven't given any latency figures, but hands-on reports all indicate "no detectable lag" over local connections. Some mention visible encoding artefacts, so the bitrate used may not be very high.
I need realtime ray tracing. (Score:1)
Lemme know when you can stream a 4k render (as in 4096i) to my house with a 50ms latency and reaction time.
Re: (Score:2)
Lemme know when you can stream a 4k render (as in 4096i) to my house with a 50ms latency and reaction time.
50 ms network latency can be unplayable. The local hardware, display, and controller will add another 50-75 ms of latency on top of that. They already do with consoles. If playing on a computer with a good monitor instead of a television this might be cut down to 25 ms, in which case a 25-50 ms latency from the network might be acceptable overhead.
Additionally, connections don't maintain constant latency. A 50 ms latency will jump to 200 ms (or in my ISP's case, 5 s) every few minutes. If you've ever p
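The budget the parent describes can be added up explicitly. All the per-stage numbers below are illustrative assumptions, not measurements:

```python
# An end-to-end input-to-photon latency budget for cloud gaming, built
# from assumed per-stage figures (illustrative only).

budget_ms = {
    "controller + OS input": 10,
    "game simulation (1 frame @ 60fps)": 17,
    "encode": 5,
    "network round trip": 50,
    "decode": 5,
    "display": 17,
}

total = sum(budget_ms.values())
print(total)  # 104 ms
```

Even with optimistic encode/decode figures, a 50ms network round trip pushes the total past the ~100ms point where fast-action games start to feel laggy, and jitter spikes only make it worse.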
Re: (Score:2)
the best part of this, is the waiting (Score:4, Insightful)
geeks love to wait for crap, and nvidia is the master of the paper launch
expect this to hit stores by august 31, and in the meantime geeks will be creaming their shorts reading blog posts about how awesome this is and watching unboxing videos made by the PR guys
Re: (Score:2)
Re: (Score:1)
Fucking ATI shill!
Re: (Score:2)
Exactly. I remember exact same hype surrounding previous Tegras.
Re: (Score:2)
i've bought nvidia cards since the 1990's TNT2 and they always pull this crap
one time they announced a card when they hadn't even taped the GPU out at TSMC yet
a 9 month wait for retail means the final product is not ready and they don't have a manufacturing deal set up yet. for all the hype of the Tegra 4, the LTE modem is not ready yet, so don't expect phones for it until next year at the earliest. by then the apple A8 will be out
Let's hope it is more efficient than Tegra 3 (Score:2)
According to a study of power efficiency focused on tablets with different CPUs (nVidia Tegra 3, Qualcomm Krait APQ8060A, Samsung Exynos Cortex A15) from anandtech.com ( http://www.anandtech.com/show/6536/arm-vs-x86-the-real-showdown [anandtech.com] and http://www.anandtech.com/show/6529/busting-the-x86-power-myth-indepth-clover-trail-power-analysis [anandtech.com] ), the nVidia Tegra 3 is less efficient than the Intel Clover Trail platform:
* Intel Clovertrail vs nVidia 3: "Ultimately I don't know that this data really changes what we already knew a
Re: (Score:3, Informative)
NVIDIA say that Tegra 4 is 45% more efficient power-wise than Tegra 3. Some of this will be down to its 28nm process rather than the 40nm process Tegra 3 utilised. They also say it is 2.6x faster.
In addition the AnandTech articles are all set up by Intel, so you need to take the results with a large pinch of salt.
It does raise some questions about Samsung's 32nm process, although a large amount of the power consumption of the Exynos 5250 could be the GPU rather than the CPU - the Exynos uses a very high pe
720 Xbox 360? (Score:2)
Re: (Score:1)
the test you are referring to was done with tegra 3
Re:Tegra *3* vs Intel Medfield (Score:1)
Old 40nm design pitted against latest Intel 32nm design.
Intel design doesn't impress outright.
Was this the one done with Windows RT, that cannot make use of Tegra 3's low power companion core? Yes, yes it is.
Why such a low resolution display? (Score:2)
With 1080p 5" android phones being all the rage, why bother with 720p?
Re: (Score:2)
With 1080p 5" android phones being all the rage, why bother with 720p?
they cost more and are hard to source currently (one affecting the other).
Re: (Score:2)
Re: (Score:2)
Don't all the 1080p Android phones currently have terrible performance and battery life?
Re: (Score:1)
Re: (Score:1)
Nvidia confusing people again. (Score:4, Informative)
Nvidia makes the mistake of thinking its audience might be informed- big mistake.
At this conference, they showed a variety of services, which should be understood individually.
1) The proprietary GRID server rack system. The difference with Nvidia's solution is that the rack includes a butt-load of low-to-mid end Nvidia graphics chips. These are a solution looking for a problem. Nvidia foolishly suggested cloud gaming (rendering on the server, with images sent via the internet to the clients), a model that has already crashed and burned 1.5 times (Onlive went bust, Gaikai gained little traction and was sold). Worse, Nvidia's weak GPU solution is not good for processing the AAA games that people love on their PCs.
2) The launch of the Tegra 4- a high end ARM SoC, with a very high end GPU (graphics).
3) The launch of a reference design for a mobile Android gaming device based on the Tegra 4
4) The launch of a 'streaming' technology that allows PC games to be rendered on the desktop, and then wirelessly transmitted to a hand-held Android device, allowing the tablet to 'run' even the most powerful PC games. Of course, Nvidia was saying that their service would be proprietary, requiring specific Nvidia graphics cards in the PC, and a Tegra 4 mobile device.
The new Nintendo Wii U does the same thing, using AMD/ATI technology. Third party apps already exist allowing you to hack current PC games and send their output to generic Android devices.
As one might imagine, the problem is simply one of real time video encoding (for the game output), and then playing back the video-stream on the Android device. Meanwhile, input is gathered on the Android device, and transmitted back to the PC to 'control' the game. It is obvious that such software methods will work at their best when built into the drivers on the PC, and this is what Nvidia is offering.
5) The Tegra 4 is revealed to be extremely power-hungry when all its processing units are being thrashed. No surprise here. The new paradigm for high-end ARM chips is parts that can go from extremely low power usage all the way up to power profiles usually associated with notebooks- and I mean in the same chip. We are actually close to mains-powered desktop ARM parts (which will easily rival Intel on a performance per chip cost basis). The market is demanding that the high-end mobile parts can achieve ever higher performance figures, regardless of the impact on battery life.
6) Nvidia showed various 'soft' smart-TV like functions by using the Tegra 4 as input for a 4K TV. Here we see the growing logic of using an external Android device as the heart of 'smart' TVs, rather than relying on the dreadful proprietary hardware/software solution the TV manufacturers build into the TV itself.
So, in conclusion, Nvidia was really just releasing its latest ARM SoC, the Tegra 4. This part will go up against a lot of competition, and is a risky bet. Nvidia really needs the market to value Tegra's unique functions, and this really only means Nvidia's new GPU cores. Unfortunately for Nvidia, the power of their graphics can only be unleashed in mobile devices with very substantial battery capacity, and proper cooling- nothing like your current cheap tablet ecosystem. If Tegra 4 is placed into an ordinary tablet design or phone, the chip will have to be choked to a fraction of its potential, else the battery will last less than 1 hour, and the device will get very hot indeed.
The Tegra 4 may represent Nvidia giving up on the old Android mobile market, and focusing instead on applications that can provide more power, and dissipate more heat, like set-top boxes, and chunky hand-held gaming devices.
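The streaming model described under point 4 reduces to two simple loops, which can be sketched as below. All function names and parameters here are invented for illustration; the real pipeline presumably uses the GPU's hardware H.264 encoder and a driver-level capture path, which is exactly why it benefits from living inside the graphics driver.

```python
# Hypothetical skeleton of a PC-to-handheld game streaming pipeline:
# frames flow PC -> device, controller input flows device -> PC.
# Every name here is an invented stand-in, not a real API.

def pump_one_frame(grab_frame, encode, send_packet):
    """One iteration of the forward path: render -> encode -> ship."""
    raw = grab_frame()          # e.g. read back the rendered frame from the GPU
    send_packet(encode(raw))    # compress (hardware H.264) and transmit

def pump_one_input(recv_input, inject):
    """One iteration of the reverse path: controller event -> game."""
    event = recv_input()        # packet from the handheld's controller
    inject(event)               # feed it into the game as if it were local input
```

Keeping at most one or two frames in flight between `grab_frame` and `send_packet` is what bounds the added latency; any buffering beyond that turns into visible input lag.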
Re: (Score:1)
Re: (Score:1)
240 GPUs, 200TFLOP... (Score:1)
nice (Score:1)
Epic fail in gaming (Score:2)
nVidia should be embarrassed to have released this "game console".
It has to be about the shittiest design for any game controller/portable game platform ever. Tacking a folding screen onto a game controller hasn't been seen since the 1990s, and this device is the functional equivalent of the Atari joystick you could plug into your TV to play 1 of 50 games, the kind they used to sell in mall kiosks a few years back. Has nVidia even seen the PS Vita? Sleek, well-integrated screen built into a controller, not a sc