NVIDIA RTX Technology To Usher In Real-Time Ray Tracing Holy Grail of Gaming Graphics (hothardware.com) 159
HotHardware writes: NVIDIA has been dabbling in real-time ray tracing for over a decade. However, the company just introduced NVIDIA RTX, its latest effort to deliver real-time ray tracing to game developers and content creators for implementation in actual game engines. Historically, the computational horsepower required for real-time ray tracing has been too great to be practical in actual games, but NVIDIA hopes to change that with its new Volta GPU architecture and the help of Microsoft's new DirectX Raytracing (DXR) API enhancements. Ray tracing is a method by which images are enhanced by tracing rays, or paths of light, as they bounce in and around an object (or objects) in a scene. Under optimum conditions, ray tracing delivers photorealistic imagery with correctly cast shadows; water effects that show proper reflections and coloring; and scenes lit with realistic lighting effects. NVIDIA RTX is a combination of software (the company's GameWorks SDK, now with ray tracing support) and next-generation GPU hardware. NVIDIA notes that its Volta architecture has specific hardware support for real-time ray tracing, including offload via its Tensor core engines. To show what's possible with the technology, developers including Epic, 4A Games and Remedy Entertainment will be showcasing their own game engine demonstrations this week at the Game Developers Conference. NVIDIA expects the ramp to be slow at first, but believes most game developers will eventually adopt real-time ray tracing.
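To make the summary's "correctly cast shadows" concrete: in a ray tracer, a surface point is in shadow when a ray cast from that point toward the light is blocked by intervening geometry. A minimal Python sketch of that shadow-ray test (the sphere scene and helper names are illustrative, not from NVIDIA's SDK):

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Nearest positive hit distance t along a normalized ray, or None.
    (Ignores the far root; fine when the origin is outside the occluder.)"""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-6 else None  # epsilon avoids self-intersection

def in_shadow(point, light_pos, spheres):
    """Cast a shadow ray from `point` toward the light; any occluder
    strictly between the two means the point is in shadow."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(x * x for x in to_light))
    direction = [x / dist for x in to_light]
    for center, radius in spheres:
        t = ray_sphere_t(point, direction, center, radius)
        if t is not None and t < dist:
            return True
    return False
```

For example, a sphere of radius 1 at (0, 0, 5) blocks the ray from (0, 0, 10) to a light at the origin, so that point is shadowed; a point offset to (0, 5, 10) with its light at (0, 5, 0) is not.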
Microsoft, really? (Score:5, Insightful)
quote: "and the help of Microsoft's new DirectX Raytracing (DXR) API enhancements."
There's a red flag. Is this going to be yet another graphics "standard" for Windows only?
Re: (Score:3, Interesting)
Probably. But, honestly, that's where the gaming market is anyway.
NVIDIA wants to put out cool products, but I doubt they start off giving a crap about Linux and other platforms.
Consoles (Score:1)
the only OS for gaming
beside PC compatible running Windows, there is this other small thing in a corner also used to play games.
It's called "consoles"
(Surprise: It's actually a sizeable market. You might want to gloat that consoles are dead and PC/Windows killed them. But in practice it's still a market that brings in a lot of cash.)
Except for the few studios that only produce exclusives, a technology that is only available on a single platform (e.g. PC/Windows) shuts out a lot of potential profit from the other platforms.
You ca
Re:Consoles (Score:4, Informative)
Consoles will be relevant in the discussion when they launch a console that will support this tech. Until then it's a PC ray-traced world.
Re: (Score:2)
Sure there are some games for Mac and Linux but be honest, PC Gaming is a Windows world and will be for the foreseeable future.
Re: (Score:2)
Seriously, I don't really give a crap about which OS my gaming console runs. The only difference is that my gaming console happens to be a "PC".
Re: (Score:2)
Currently, 199 of my 442 games on Steam run on Linux. Many of them AAA titles.
Sure, less than half of my entire library runs on it... but that's a far cry from "some" like it used to be.
OpenGL (Score:2)
As opposed to OpenGL where you had the ATI version, the Nvidia version, and going further back the Solaris and IRIX versions.
Re: (Score:2)
...really, Really?? (Score:2)
Re: (Score:3)
quote: "and the help of Microsoft's new DirectX Raytracing (DXR) API enhancements."
There's a red flag. Is this going to be yet another graphics "standard" for Windows only?
Of course not. It will cover Xbox too, and with it the majority of the gaming market.
No Progress (Score:2, Interesting)
Zobeid says, "there must be no progress except on my terms! No Progress I say!!"
Here's a clue. This is an Nvidia technology. OMG, they left AMD out!
In a world where companies bring value-added and proprietary technologies to the table, this is what happens. Making technologies universal and commodities happens through competition.
If you wait for standards committees, cooperative ventures, FOSS, Vulkan, and everyone to get their shit together, progress takes years longer and sometimes stops entirely. Is
Re: (Score:1, Insightful)
1). Feel free to walk down the hall and create that open standard. In the meantime Nvidia and Microsoft have already done it. Maybe this is the kick in the creative ass you need;
2). The benefit is to the customers. You are acting like this is a proxy battle over which board member gets elected. Whoever does this, the customers benefit;
3). You literally added nothing to the GP comment.
FOSS fanatics don't get it. Since there is no practical way to create binary compatibility between Linux, OSX, Window
Re: (Score:2)
If it works? No, it's too big an achievement. Someone else will create their own API.
Re: (Score:2)
Nvidia writes the DirectX standards. MS rubber-stamps them while Nvidia already has the features in hardware, to screw over AMD/ATI. It has been like this for a while.
yep, ssh,bash,remote desktop (Score:1)
Oh yeah, so many Linux features we had 10 years before windoze.
In 1/5th the memory space.
Who did 24-bit color first? Not that VGA Windows shit in the '80s.
Re: (Score:2, Troll)
ROTFL. So, Mr. Troll, that means that you were bopping around on multiple desktops, using remote logins and graphical applications on one machine displayed on another, way back there in the late '80s and early '90s (when these were all developed features in Unix-based operating systems and Windows was a thin, stupid shell on top of DOS trying to compete with Apple's GUI)? Features that were in Linux almost from day one? I'd go down the list of things that were in the early Linux distros, such as SL
Re: (Score:2)
/. is about achieving something? When did that memo come out? Damn, and here I thought it was all about rants, flames, trolling... and curiously, mooing MOO cow MOO. And a rare (fortunately, my eyeballs are still burning) goatse. And for the record, I try very hard not to participate in meetings to discuss progress and status...:-)
Re: (Score:1)
I think you are confusing Linux with Unix.
You missed the point. Those "cloned features" existed long before Windows was even an OS. GNU's Not Unix was designed using Unix as a model, not by looking at anything Microsoft was doing in its graphical DOS shell.
kernel panics, crashes, and lack of software combined with lack of drivers and poor documentation
Well, it's better than Windows. [xkcd.com]
Re: Microsoft, really? (Score:1)
Windows 10 still doesn't support network drives or real userspace. At this point I doubt anyone is using Windows for much more than writing Word/PowerPoint documents and a bit of light-hearted internet browsing.
Microsoft web browsers hitting our web servers are down to sub-1% of users.
Raytracing and AR are too little, too late, IMHO. Not because they won't be OK, but simply because competent developers won't touch the platform with a bargepole.
Re: Microsoft, really? (Score:2)
The PC gaming market is not a significant computer market anymore; it used to make up some 90% of computing equipment and software sales, and now I doubt it makes up more than 1%. Consoles (Xbox, PS4, Nintendo) and mobile devices are where all the developers are.
Most games developers are therefore using a compatibility layer which targets all platforms, and are unlikely to devote any reasonable dev resources to anything Windows-specific. Even HoloLens is struggling to find developers, and that is an order of magnit
Re: (Score:2)
disagree on the VR.
VR is not compute-intensive, despite the Oculus hype. The headsets require less GPU than split screen or dual monitors. NASA has had decent VR headsets since the 1980s.
The current sticking point with VR is that all the tricks they use, like bump mapping, various lighting techniques, and a load of volumetric fill, do not work in true 3D.
That knocks the visual look back to the late 1990s (because those lovely 3D bumps look like flat sprites).
Raytracing is the only way to move forward.
And it will.
Re: (Score:2)
the headsets require less gpu than split screen or dual monitors.
Huh? 1080p x 2 @ 90fps is required for good VR... It requires a pretty beefy card. The split screen/dual monitors statement makes no sense at all. Fill rate requirement is doubled, along with geometry requirements... It absolutely is more GPU intensive than any non-VR application, period.
Next, do you even own a headset? I do.
Frankly, it looks fucking amazing. Like "you can't even imagine it until you've tried it" amazing.
I haven't shown it off to a single friend who didn't take that headset off grinning,
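The fill-rate arithmetic behind the parent's objection is easy to check. Using the comment's own figure ("1080p x 2 @ 90fps") against a single 1080p monitor at 60 Hz (the monitor baseline is an assumption for comparison):

```python
# Shaded pixels per second, using the parent's "1080p x 2 @ 90fps" figure.
vr_pixels_per_sec = 2 * 1920 * 1080 * 90   # two 1080p eye buffers at 90 Hz
mon_pixels_per_sec = 1920 * 1080 * 60      # one 1080p monitor at 60 Hz
ratio = vr_pixels_per_sec / mon_pixels_per_sec
print(ratio)  # 3.0 -- triple the shaded pixels per second, before geometry costs
```

Double the eye buffers times 1.5x the refresh rate is a clean 3x, which supports the "pretty beefy card" claim even before counting the doubled geometry pass.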
Re: Microsoft, really? (Score:1)
What about Microsoft Bob?!
Re: Been there, done it (Score:1)
Your eyes only collect already traced rays. The sun raytraces everything in the solar system.
Computer raytracing actually works backwards - rays are cast from the perspective of the camera, and those that end up at a light source contribute to the image.
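Backward (eye) ray casting starts by generating one primary ray per pixel from the camera. A minimal pinhole-camera sketch (the field of view and the looking-down-minus-z convention are illustrative choices, not any particular engine's API):

```python
import math

def primary_ray(px, py, width, height, fov_deg=60.0):
    """Direction of the ray through pixel (px, py) for a pinhole camera at
    the origin looking down -z. Returns a normalized (x, y, z) tuple."""
    aspect = width / height
    scale = math.tan(math.radians(fov_deg) / 2)
    # Map the pixel centre into [-1, 1] normalized device coordinates.
    x = (2 * (px + 0.5) / width - 1) * aspect * scale
    y = (1 - 2 * (py + 0.5) / height) * scale
    z = -1.0
    n = math.sqrt(x * x + y * y + z * z)
    return (x / n, y / n, z / n)
```

The ray through the exact image centre points straight down the view axis; every other pixel gets a direction fanned out by the field of view.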
No thanks, involves Windows 10 (Score:4, Insightful)
Re: (Score:2)
Those furry fetish sites you love to visit probably spy on you a lot more than MS does.
They do, but I am not forced to run them 24/7 at Ring 0 privilege.
Re: (Score:3, Insightful)
> that doesn't spy on me.
It's a good thing you're posting this via snail mail from a compound in the desert then.
I'm betting that if we ever get a full look at the scope of all the online spying that goes on with people's every day internet use, Windows 10's telemetry won't even be in the top 100 of data harvesting schemes to worry about.
Re: (Score:2)
> that doesn't spy on me.
It's a good thing you're posting this via snail mail from a compound in the desert then.
No, but I tightly control what is disclosed in my /. posts. No such luck with Win10.
I'm betting that if we ever get a full look at the scope of all the online spying that goes on with people's every day internet use, Windows 10's telemetry won't even be in the top 100 of data harvesting schemes to worry about.
There are hundreds of murders a day nationwide. So we shouldn't worry about someone burglarizing your place until all of those other crimes are solved, right?
Re: (Score:2)
> No, but I tightly control what is disclosed in my /. posts. No such luck with Win10.
I'm sure you do *in the post*, but do you really know what's leaking from your browser when you simply visit /.? Ever run Wireshark to look? You might be a bit surprised...
> There are 100s of murders a day nationwide. So we shouldn't worry about someone burglarizing your place until all of these other crimes are solved, right?
No, what I am suggesting is that even if there are local burglaries you shouldn't sit at
Re: (Score:3)
> No, but I tightly control what is disclosed in my /. posts. No such luck with Win10.
I'm sure you do *in the post*, but do you really know what's leaking from your browser when you simply visit /.?
I do. I run Pale Moon + NoScript and don't allow any kind of third-party plugins. So all /. knows about me is my IP address and the contents of my posts.
Re: (Score:2)
Unless whatever analytics they use simply tracks you by your browser's "fingerprint"...
https://panopticlick.eff.org/ [eff.org]
https://amiunique.org/ [amiunique.org]
Re: (Score:1)
Or, as said by XKCD, this: Licence Plate [xkcd.com]
Re: (Score:2)
Which is why internet here is only available through proxy, which only certain apps know about, and that does not include Windows.
It means Windows Update can't find anything, Telemetry can't send anything, Cortana doesn't work, Tiles do not update, etc.
It just amazes me people can live with a computer that doesn't do exactly what you tell it to.
Re: (Score:2)
I keep a windoze box just for stuff I can't do on Linux. I don't connect it to the internet, I never do any business on it, it's just for a handful of programs such as the programmer for my car's computer. You could build a PeeCee just for gaming and it wouldn't matter that they were spying, there'd be nothing for them to see.
Re: (Score:3)
No thanks. I will stick to gaming on Windows 7, that doesn't spy on me.
By the time this gets to market you will be using Windows 7 with so many unpatched holes and bugs, EVERYONE will be spying on you.
Re: (Score:2)
Running an 8-year-old version of Windows is like hopping out of an airplane and saying aloud, "well so far so good".
Re: (Score:2)
You do realize that 1) Microsoft listened to consumers and made the telemetry data very easy to disable in Windows 10, and 2) that very same telemetry data collection and reporting was already back-ported and pushed as an "update" to Windows 7? Also 3) it is similar telemetry data that is collected by other OSes like Android and iOS, plus applications like Firefox and Chrome (where do you think they get the stats for X% of users do Y with our product in their reports?)
Re: (Score:3)
You do realize that 1) Microsoft listened to consumers and made the telemetry data very easy to disable in Windows 10
Only in the Enterprise version. Consumer versions resist disabling this feature to the point that the OS disregards registry settings and bypasses its own internal firewall.
2) that very same telemetry data collection and reporting was already back-ported and pushed as an "update" to Windows 7?
Yes, but you can block specific patches, and there exists a known list of them.
Also 3) it is similar telemetry data that is collected by other OSes like Android and iOS, plus applications like Firefox and Chrome (where do you think they get the stats for X% of users do Y with our product in their reports?)
Yes, every commercial OS went to shit insofar as privacy is concerned. Even some Linux distros spy on you. This doesn't mean you have to accept it.
Re: (Score:2)
They backported all the spyware to Windows 7 a year ago. Where have you been?
No windows OS is safe at this point if you don't want Microsoft monitoring you.
Re: (Score:2)
How do you know Windows 7 doesn't spy on you? Not that it has to, because your internet footprint is huge, like everyone else's.
Whistling In The Dark. (Score:2)
No thanks. I will stick to gaming on Windows 7, that doesn't spy on me.
Developers target platforms with significant market share and mainstream graphical support. Mainstream support for Win 7 as an OS ended in 2015; OEM Win 7 system sales to the consumer market ended in 2014. Four years is a long time in this business.
Raytracing does not produce photorealistic images (Score:2, Informative)
Ray tracing is great for specular (not spectacular...) reflections, i.e. light interacting with mirror-like, non-diffusing surfaces. It produces highlights, (perfect) refraction, (perfect) reflections and hard shadows. Anything else is not the domain of ray tracing. You can have fuzzy effects with ray tracing, but they come at an extreme processing power cost. Some effects are practically impossible to calculate with ray tracing. Ray tracing can contribute a small part of the rendering equation (the specula
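The perfect specular reflection the parent describes is also the cheapest part of the technique: for an incoming direction d and unit surface normal n, the mirrored direction is r = d − 2(d·n)n. A one-function sketch:

```python
def reflect(d, n):
    """Mirror-reflect direction d about unit normal n: r = d - 2(d.n)n."""
    dn = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2 * dn * b for a, b in zip(d, n))
```

A ray heading straight down at a floor with normal (0, 1, 0) bounces straight back up; an angled ray keeps its horizontal component and flips the vertical one. It is the fuzzy, glossy cases in between that blow up the cost.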
Re: (Score:2)
No. While at first glance the video demo (https://www.youtube.com/watch?v=70W2aFr5-Xk) is very pretty, it's clear they've gone overboard to make everything they possibly can reflective. That in itself defeats the attempt at photorealism.
Aside from the ultra-reflectivity of everything in the scene, the most annoying glitch on display was the bursty behaviour of the blooms on the metal edges of the tables, evident because they've used a (relatively) low-poly mesh for something that should be smooth curves.
Re: (Score:3)
It's got rays.
With more GPU and CPU power, it will be perfect for every type of surface in a computer game.
The need for more extreme processing is what will grow GPU and CPU sales.
Re: (Score:2)
Ray-tracing doesn't seem necessary; games have impressive-looking lighting already, and they've gotten good at faking it. I'm skeptical about real-time ray-tracing being able to handle high polygon counts and output at 4K.
Re: (Score:2)
Microsoft will help along with the high-polygon-count and 4K game marketing.
The must-have games and GPUs that are ray-tracing-1.0 ready.
Re: (Score:2)
No, I'm saying the tech probably isn't there yet; I doubt they can do real-time, hi-res ray-tracing on consumer-level hardware.
Re: (Score:2)
Just one more must-have new selling point to hype for consumer-level hardware.
Using Graphics cards for actual games? Wow!!! (Score:5, Insightful)
Re: (Score:2)
Yeah! I can't wait to see how much crypto-currencies I can mine at once with these new ray-tracing GPUs!
Just kidding. All I got is my gaming PC with a single GPU that mines when I'm not playing games.
Re: (Score:2)
The run of the mill for the past few years is that graphics cards are for mining the cryptocurrency flavour of the month and creating magical AI bots. This is the first time in years I have seen an article that refers to the use of graphics cards for actual graphics.
My guess is that the graphics companies are seeing that cryptocurrencies may be peaking or on the decline. Between various countries banning them, municipalities banning them or charging more for electricity, and people starting to wise up that many cryptocurrencies are scams, the writing may be on the wall. AMD and Nvidia may be seeing a drop-off in sales; they would be the first to know if cryptocurrencies have peaked.
Reminds me of a paper from Intel some years ago (Score:3)
Tracing Rays Through the Cloud [intel.com] is a pretty good example of what was "next-gen" 6 years ago. None of the imagery there was generated in real time (just read the paper), but it was still a good read about what goes into ray tracing. Intuitively we know what ray tracing is, but computing it for reflective/refractive surfaces is a ton of work.
Of course, I won't believe it's real-time until it can render a house of mirrors at 60fps+.
Re: (Score:2)
In the old days, all we had was 320x200 in 16 colours and we were happy with that.
Re: (Score:2)
Ah but we used to break those boundaries as well! I was on a demo crew who played with overscan - cf. http://aldabase.com/atari-st-f... [aldabase.com]
Re: (Score:2)
We also developed the SPX viewer which allowed for 'lots' of colours! I seem to recall that much of the imagery was pr0n. But we were teenagers.
Link to an actual video... (Score:1)
...can be found here:
https://www.youtube.com/watch?v=jkhBlmKtEAk [youtube.com]
Looks quite impressive even without the post filter in my opinion.
Re: (Score:1)
It really depends on how you're using ray tracing! If the rays you're tracing arise as you're sampling together a solution to the rendering equation (so that the rays are recursive, or chained together), glossy surfaces are your arch-nemesis and will create bright "fireflies" all over the screen that you have to resort to very advanced path tracing methods to try to minimize. For example, if you're tracing a ray towards a diffuse wall that then bounces towards an extremely specular object in such a way that
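A common (if biased) mitigation for the "fireflies" described above is to clamp each sample's radiance before averaging, trading a little lost energy for much lower variance. A toy illustration with made-up sample values (not output from any real renderer):

```python
def average_radiance(samples, clamp=None):
    """Mean of per-pixel radiance samples, optionally clamping each sample.
    Clamping suppresses rare huge-value 'fireflies' at the cost of bias."""
    if clamp is not None:
        samples = [min(s, clamp) for s in samples]
    return sum(samples) / len(samples)

# 99 ordinary samples plus one firefly from a low-probability specular path.
samples = [0.5] * 99 + [500.0]
print(average_radiance(samples))              # 5.495 -- the firefly dominates
print(average_radiance(samples, clamp=10.0))  # 0.595 -- clamped estimate
```

The unbiased fix is better sampling (next-event estimation, MIS, and the advanced path tracing methods the parent alludes to); clamping is the blunt instrument production renderers reach for anyway.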
why directX? (Score:1)
What's wrong (to NVIDIA's eyes) with OpenGL?
Re: (Score:2)
No, you imbecile. You focus on Intellivision, Colecovision and Atari 7800.
Re: (Score:1)
Personally, I don't think this sort of thing really justifies the cost of raytracing when current techniques work fairly well, but Nvidia clearly disagrees.
Re: (Score:2)
Easy, AMD can use that. The dirty secret is that Nvidia wrote DirectX 11 and already had a GPU with the code in hardware, ready to beat AMD (or I should say ATI at the time).
Nvidia owns DirectX as much as MS does and needs a closed standard to monopolize the market.
Re: (Score:2)
Didn't Vulkan get destroyed on stardate 2258.42?
Raytracers are pretty fun... (Score:5, Interesting)
When I was in college, I took two semesters of graphics - but this was in the late DOS era. Early OpenGL existed, but because this was a real theoretical college class on graphics, we built a real raytracer from pure math in C code and assembler rather than trying to stick to some arbitrary industry standard.
Cubes, spheres, tori, lighting, reflections - we did it all, piece by piece, in glorious 640x350. It was ugly, and eerie, but really fascinating in terms of seeing pure mathematical expressions become 3D objects, pixel by pixel.
Since then, I've worked in several jobs frequently involving 'proper' graphics, even worked on a bunch of professional shipped games (mostly gameplay and systems, occasionally worked everywhere though) - and I can appreciate the need to use all the tricks that we do to make origami worlds, everything angled to the camera, but I really did enjoy creating worlds of actual objects, and having the camera pull its own shell of perspective out of the scene instead.
Which is how most assets are sort of created, actually, in the asset creation tools. You model the object, rip the polygons out how you can, create meshes and surfaces, and then try and cheat on everything to make it seem like the 'real' object again as cheaply as you can get away with. It's not quite raytracing outside a few tools, but it's an interesting hybrid.
Raytracers are a cool educational tool - but I can also see why they're only really trotted out when CPU manufacturers want to push for a race to buy more CPUs. They don't scale as well as modern techniques - and although there's some neat tricks you can do when you have your assets really 'present' mathematically (Demoscene stuff does this occasionally), it's usually not a better tradeoff than using the abstraction tools available to make it all work faster.
Ryan Fenton
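The kind of from-scratch raytracer described above fits in a few dozen lines. A toy Python version that traces one ray per character cell and shades a single sphere with Lambert's cosine law (all scene values are arbitrary illustration choices):

```python
import math

def render_sphere(width=32, height=16, radius=0.6):
    """Render an ASCII image of a diffuse sphere at (0, 0, 3), one traced
    ray per character cell, shaded by Lambert's cosine law."""
    shades = " .:-=+*#@"
    light = (0.577, 0.577, -0.577)        # direction toward the light (unit-ish)
    cx, cy, cz = 0.0, 0.0, 3.0            # sphere centre
    rows = []
    for j in range(height):
        row = []
        for i in range(width):
            # Pinhole camera at the origin looking down +z.
            x = 2.0 * (i + 0.5) / width - 1.0
            y = 1.0 - 2.0 * (j + 0.5) / height
            n = math.sqrt(x * x + y * y + 1.0)
            dx, dy, dz = x / n, y / n, 1.0 / n
            # Ray-sphere intersection: t^2 - 2t(d.c) + |c|^2 - r^2 = 0.
            b = dx * cx + dy * cy + dz * cz
            disc = b * b - (cx * cx + cy * cy + cz * cz - radius * radius)
            if disc < 0.0:
                row.append(" ")           # ray misses the sphere
                continue
            t = b - math.sqrt(disc)       # nearer root = visible surface
            nx = (t * dx - cx) / radius   # surface normal at the hit point
            ny = (t * dy - cy) / radius
            nz = (t * dz - cz) / radius
            lam = max(0.0, nx * light[0] + ny * light[1] + nz * light[2])
            row.append(shades[min(int(lam * (len(shades) - 1)), len(shades) - 1)])
        rows.append("".join(row))
    return "\n".join(rows)

print(render_sphere())
```

Every character is a full intersection test plus a shading calculation, which is exactly why the classroom version was "pixel by pixel" slow and why hardware offload is the interesting part of the RTX announcement.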
Re: (Score:2)
Wait a minute... 640x350? You did raytracing in 16 colours?! Yikes.
Re: (Score:1)
Ever seen a life-sized ASCII-art nude printed on wide-carriage, green-bar paper and hung up on a wall?
We made do with what we had. :-P
We used to dream of 640x350, because they hadn't even invented that yet. The first porn I saw on the internet was a 320x200 interlaced animated GIF ... and it was monochrome. Most monitors didn't even have actual colours.
Why, we used to have to debug our code on paper printouts by hand walking through the co
Re: (Score:2)
640x350 16 color EGA graphics as an option - 1984
Just because you didn't see stuff until 5 years after it was commercially available doesn't mean the rest of us didn't.
Re: (Score:2)
My first computer was a Colour Computer 2 with 64KB and tape drive. Floppy drives were incredibly expensive.
My first modem was 300bps. After that I went to 2400bps, then 14.4kbps and finally 28.8kbps.
My first PC was an 8086 running at 8MHz with 256KB.
From some parts of your comment, you started before me (wide-carriage green-bar paper printers).
From other parts of your comment, you started after me (AFAIK 320x200 in monochrome did not exist, you either had monochrome Hercules graphics or four-colours CGA, th
Re: (Score:3)
The tricks played to make things look real have been very convincing and take less power.
But go look at a teardown of a single scene in GTA V
http://www.adriancourreges.com... [adriancourreges.com]
"All in all there were 4155 draw calls, 1113 textures involved and 88 render targets."
And a lot of clever trickery that engine etc. programmers have to apply, texture artists have to take account of, etc. etc etc.
The "shortcuts" give convincing near-realism on low hardware for a hefty development price.
Ray-tracing gives convincing re
Platform (Score:3)
Yeah, you have to love the graphic towards the bottom:
"Board Industry Support"
API: Microsoft.
That's it. The only option. Not very "broad".
Re: (Score:2)
It is, on the other hand, very board.
Intel already tried this (Score:2)
Intel was trying to push this when it was clear they weren't making headway in the GPU space, and also to push a heavier reliance on CPUs over GPUs (or at least in conjunction with them), but it never seemed to gain any traction and was just relegated to tech demos.
https://www.geek.com/games/int... [geek.com]
https://www.hpcwire.com/2010/0... [hpcwire.com]
I guess we'll see how Nvidia does.
Re: (Score:2)
I guess that's why the article says that Intel already tried this, and mentions reasons why this is different, including the decade of technology improvements since then. I wasn't sure why they would mention that.
Likely far in the Future (Score:2)
Between the two major graphics card manufacturers (AMD and NVIDIA), it's not uncommon for either to work directly with Microsoft to introduce new DirectX features. The new Vulkan-type rendering engines, for example, are major contributions by AMD. Unfortunately, when it comes to raytracing, the main issue is that the amount of processing required for anything better than a simple scene is too much to run in realtime on just about any modern system. If there's glassy / refractive objects, the amount o
Texture quality? (Score:2)
Why is it that every demonstration of ray tracing results in every surface looking like velvet? The specular reflections from small bumps in the textures are just insane. Is it because they were dialed up on purpose, or is it some effect of raytracing that needs to be fixed with something like anti-aliasing?
Re: (Score:2)
Thanks, you suspected correctly. It was the metallic surfaces in the video. When the camera gets close you can see they are textured and cease to sparkle uncontrollably. It's a shame, because while the video looked good in principle, showing off the wonderful light reflections, I think I have seen far more realistic-looking footage from traditional game engines.
12 years later ... (Score:2)
I remember around 2006 (when AMD had just acquired ATI?) Intel was making a lot of noise about running graphics directly on the CPU; GPU-less machines were their big prediction.
They were mentioning real-time ray-tracing as the next big thing in graphics, and their CPUs were obviously the natural thing to do it on. Here is an example from 2007:
https://www.youtube.com/watch?... [youtube.com]
AMD and Nvidia immediately pointed out that their GPUs were much better for this job, and then nothing happened. Now I see they
Really? (Score:2)
I actually found that demo video pretty unimpressive.
https://www.youtube.com/watch?... [youtube.com] to link directly.
Oh sure, it's PRETTY but there are some odd artifacts:
- the table edges at 0:08+ flicker oddly
- 0:47 the light effect from the source looks hemispherical, but the device itself wouldn't be?
- the coffee cup with saucer at 1:08 has a weird glowy base
These may be explicable, but they seemed odd in a tech demo.
Not the Holy Grail (Score:3)
Turns out raytracing isn't the holy grail of gaming graphics, although it's been hyped for so long that it seems like it. I always thought Pixar films were raytraced, but they were actually rasterized. Cars was their first film that used raytracing at all, and even then it was only during the big race (due to all the reflections, presumably). I do know that shows like Babylon 5, and I believe ST:TNG, did use raytracing, though. Nvidia shows off 'realtime raytracing' every few years but it never takes off, presumably because better overall results are still achieved via rasterization; sure, you can get sexy shadows and reflections, but your poly count will be at early-PS3-era levels. Also, there are problems with raytracing meshes that animate (like, say, humans) that make it much slower, which is why you almost always see it done with static meshes like cars or buildings. Turns out raytracing isn't even the ultimate rendering technology; Path Tracing [wikipedia.org] is closer, if not theoretically perfect.
It's also worth noting that a form of raytracing has been in use in realtime graphics for a while, called relief mapping [wikipedia.org], which has made it into games.
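Relief mapping is indeed a miniature ray trace: the view ray is stepped through a heightfield until it first dips below the surface. A 1-D linear-search sketch (real shaders work in texture space and refine the hit with a binary search; the step count here is an arbitrary choice):

```python
def relief_march(heightfield, start_x, slope, steps=64):
    """March a 1-D view ray (height 1.0 at start_x, dropping by `slope`
    per unit x) across `heightfield` (a function of x returning 0..1) and
    return the x where the ray first dips below the surface, or None."""
    dx = 1.0 / steps
    x, h = start_x, 1.0
    for _ in range(steps):
        x += dx
        h -= slope * dx
        if h <= heightfield(x):
            return x  # first sample under the surface ~= the intersection
    return None
```

Against a flat surface at height 0.5, a ray falling one unit of height per unit of x lands at x = 0.5, as expected; a shallower ray can cross the whole texture without ever hitting.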
Re: (Score:2)
Realtime dynamic radiosity rendering is nice - it's used a lot in gaming for developing static patches.
Re: (Score:2)
This is pretty https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
"And as always they showcase the technology with a scene that would have rendered just as well with current rasterizer+lightmap techniques that people have gotten used to over the last 20 years."
Yes, and 20 years ago it would probably have taken a couple of hours to render a single frame on a top-spec PC, not 60fps real time, you clueless gimp.