NVIDIA RTX Technology To Usher In Real-Time Ray Tracing Holy Grail of Gaming Graphics (hothardware.com)

HotHardware writes: NVIDIA has been dabbling in real-time ray tracing for over a decade. However, the company just introduced NVIDIA RTX, its latest effort to deliver real-time ray tracing to game developers and content creators for implementation in actual game engines. Historically, the computational horsepower required for real-time ray tracing has been too great for it to be practical in actual games, but NVIDIA hopes to change that with its new Volta GPU architecture and the help of Microsoft's new DirectX Raytracing (DXR) API enhancements. Ray tracing is a method by which images are generated by tracing rays, or paths of light, as they bounce in and around an object (or objects) in a scene. Under optimum conditions, ray tracing delivers photorealistic imagery: shadows that are correctly cast, water effects that show proper reflections and coloring, and scenes lit with realistic lighting effects. NVIDIA RTX is a combination of software (the company's GameWorks SDK, now with ray tracing support) and next-generation GPU hardware. NVIDIA notes its Volta architecture has specific hardware support for real-time ray tracing, including offload via its Tensor core engines. To show what's possible with the technology, developers including Epic, 4A Games and Remedy Entertainment will be showcasing their own game engine demonstrations this week at the Game Developers Conference. NVIDIA expects the ramp to be slow at first, but believes most game developers will eventually adopt real-time ray tracing.
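At its core the technique is simple to state: for each pixel, fire a ray into the scene, find the nearest surface it hits, then fire secondary rays from that point toward the lights (for shadows) or along the mirror direction (for reflections). As a rough illustration, here is the ray/sphere case in Python; the function names and structure here are ours for exposition, not anything from NVIDIA's or Microsoft's actual APIs:

    import math

    def ray_sphere(origin, direction, center, radius):
        """Distance along the ray to the near sphere hit, or None on a miss.
        Solves |origin + t*direction - center|^2 = radius^2 for t; the ray
        direction is assumed to be unit length (so the quadratic's a == 1)."""
        oc = [o - c for o, c in zip(origin, center)]
        b = 2.0 * sum(d * e for d, e in zip(direction, oc))
        c = sum(e * e for e in oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0.0:
            return None                       # ray misses the sphere entirely
        t = (-b - math.sqrt(disc)) / 2.0      # near root of the quadratic
        return t if t > 1e-6 else None        # ignore hits behind the origin

    def in_shadow(hit_point, light_pos, spheres):
        """Hard shadow test: trace one ray from the hit point to the light
        and see whether any sphere blocks it before the light is reached."""
        to_light = [l - p for l, p in zip(light_pos, hit_point)]
        dist = math.sqrt(sum(v * v for v in to_light))
        direction = [v / dist for v in to_light]
        for center, radius in spheres:
            t = ray_sphere(hit_point, direction, center, radius)
            if t is not None and t < dist:
                return True                   # occluded: the point is in shadow
        return False

The "correctly cast" shadows in the summary come from exactly this kind of occlusion test, done per pixel per light; the cost NVIDIA is attacking is that a single 4K frame needs millions of such rays.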
  • Microsoft, really? (Score:5, Insightful)

    by Zobeid ( 314469 ) on Tuesday March 20, 2018 @09:02AM (#56290189)

    quote: "and the help of Microsoft's new DirectX Raytracing (DXR) API enhancements."

    There's a red flag. Is this going to be yet another graphics "standard" for Windows only?

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      There's a red flag. Is this going to be yet another graphics "standard" for Windows only?

      Probably. But, honestly, that's where the gaming market is anyway.

      NVIDIA wants to put out cool products, but I doubt they start off giving a crap about Linux and other platforms.

    • Re: (Score:3, Insightful)

      by Holi ( 250190 )
      You mean the only OS for gaming is going to support a tech that makes games look better? The horrors!
      • the only OS for gaming

Besides PC compatibles running Windows, there is this other small thing in a corner also used to play games.
        It's called "consoles".
        (Surprise: it's actually a sizeable market. You might want to gloat that consoles are dead and PC/Windows killed them, but in practice it's still a market that brings in a lot of cash.)

        Except for a few studios that only produce exclusives, a technology that is only available on a single platform (e.g. PC/Windows) is shutting out a lot of potential profit from other platforms.

        You ca

    • As opposed to OpenGL where you had the ATI version, the Nvidia version, and going further back the Solaris and IRIX versions.

    • My first thought was that Nvidia is going to have a monopoly on the technology and I feel even less comfortable with that than if it were just Windows only. Hopefully though if this catches on, or maybe even if it doesn't, AMD will come to the rescue.
    • I'm an MCSE who detests Windows and uses Linux exclusively... and even I think your post is ridiculous. Do you think MS should be releasing extensions to graphics APIs for competing platforms??
    • quote: "and the help of Microsoft's new DirectX Raytracing (DXR) API enhancements."

      There's a red flag. Is this going to be yet another graphics "standard" for Windows only?

      Of course not. It will cover Xbox too, and with it the majority of the gaming market.

    • No Progress (Score:2, Interesting)

      by Anonymous Coward

      Zobeid says, "there must be no progress except on my terms! No Progress I say!!"

      Here's a clue. This is an Nvidia technology. OMG, they left AMD out!

      In a world where companies bring value-added and proprietary technologies to the table, this is what happens. Making technologies universal and commodities happens through competition.

      If you wait for standards committees, cooperative ventures, FOSS, Vulkan, and everyone to get their shit together, progress takes years longer and sometimes stops entirely. Is

    • by geekoid ( 135745 )

      If it works? No, it's too big of an achievement. Someone else will create their own API.

    • Nvidia writes the DirectX standards. MS rubber-stamps them while Nvidia already has the features in hardware, to screw over AMD/ATI. It has been like this for a while.

  • by sinij ( 911942 ) on Tuesday March 20, 2018 @09:20AM (#56290269)
    On one hand, this technology is very exciting for any PC gamer. On the other hand, MS has locked new DirectX versions to Windows 10, so if you want this or that new feature you can only have it on Win10. No thanks. I will stick to gaming on Windows 7, which doesn't spy on me.
    • Those furry fetish sites you love to visit probably spy on you a lot more than MS does.
      • by sinij ( 911942 )

        Those furry fetish sites you love to visit probably spy on you a lot more than MS does.

        They do, but I am not forced to run them 24/7 at Ring 0 privilege.

    • Re: (Score:3, Insightful)

      by barc0001 ( 173002 )

      > that doesn't spy on me.

      It's a good thing you're posting this via snail mail from a compound in the desert then.

      I'm betting that if we ever get a full look at the scope of all the online spying that goes on with people's everyday internet use, Windows 10's telemetry won't even be in the top 100 of data harvesting schemes to worry about.

      • by sinij ( 911942 )

        > that doesn't spy on me.

        It's a good thing you're posting this via snail mail from a compound in the desert then.

        No, but I tightly control what is disclosed in my /. posts. No such luck with Win10.

        I'm betting that if we ever get a full look at the scope of all the online spying that goes on with people's everyday internet use, Windows 10's telemetry won't even be in the top 100 of data harvesting schemes to worry about.

        There are 100s of murders a day nationwide. So we shouldn't worry about someone burglarizing your place until all of these other crimes are solved, right?

        • > No, but I tightly control what is disclosed in my /. posts. No such luck with Win10.

          I'm sure you do *in the post*, but do you really know what's leaking from your browser when you simply visit /.? Ever run Wireshark to look? You might be a bit surprised...

          > There are 100s of murders a day nationwide. So we shouldn't worry about someone burglarizing your place until all of these other crimes are solved, right?

          No, what I am suggesting is that even if there are local burglaries you shouldn't sit at

      • by swilver ( 617741 )

        Which is why internet access here is only available through a proxy, which only certain apps know about, and that does not include Windows.

        It means Windows Update can't find anything, Telemetry can't send anything, Cortana doesn't work, Tiles do not update, etc.

        It just amazes me that people can live with a computer that doesn't do exactly what they tell it to.

    • by amiga3D ( 567632 )

      I keep a windoze box just for stuff I can't do on Linux. I don't connect it to the internet, I never do any business on it, it's just for a handful of programs such as the programmer for my car's computer. You could build a PeeCee just for gaming and it wouldn't matter that they were spying, there'd be nothing for them to see.

    • I agree. I'm done upgrading Windows. I'll use Win7 and all my GOG games till the end of time. I already have more games than I'll ever be able to play. However, the level of spying-asshole-acceptance I've got to reach in order to "upgrade" to Windows 10 is just too high. I'll check out realtime raytracing when it hits the PS5 or whatever... or not. Gameplay always trumps graphics anyway. If I need a real OS, I've got BSD. No EULA required.
    • No thanks. I will stick to gaming on Windows 7, which doesn't spy on me.

      By the time this gets to market you will be using Windows 7 with so many unpatched holes and bugs, EVERYONE will be spying on you.

    • Running an 8-year-old version of Windows is like hopping out of an airplane and saying aloud "well, so far so good".

    • by darkain ( 749283 )

      You do realize that 1) Microsoft listened to consumers and made the telemetry data very easy to disable in Windows 10, and 2) that very same telemetry collection and reporting was already back-ported and pushed as an "update" to Windows 7? Also 3) similar telemetry data is collected by other OSes like Android and iOS, plus applications like Firefox and Chrome (where do you think they get the "X% of users do Y with our product" stats in their reports?)

      • by sinij ( 911942 )

        You do realize that 1) Microsoft listened to consumers and made the telemetry data very easy to disable in Windows 10

        Only in the Enterprise version. Consumer versions resist disabling this feature to the point that the OS disregards registry settings and bypasses its own internal firewall.

        2) that very same telemetry data collection and reporting was already back-ported and pushed as an "update" to Windows 7?

        Yes, but you can block specific patches, and there exists a known list of them.

        Also 3) it is similar telemetry data that is collected by other OSes like Android and iOS, plus applications like Firefox and Chrome (where do you think they get the stats for X% of users do Y with our product in their reports?)

        Yes, every commercial OS has gone to shit insofar as privacy is concerned. Even some Linux distros spy on you. This doesn't mean you have to accept it.

    • They backported all the spyware to Windows 7 a year ago. Where have you been?

      No windows OS is safe at this point if you don't want Microsoft monitoring you.

    • by geekoid ( 135745 )

      How do you know Windows 7 doesn't spy on you? Not that it has to, because your internet footprint is huge, like everyone else's.

    • No thanks. I will stick to gaming on Windows 7, which doesn't spy on me.

      Developers target platforms with significant market share and mainstream graphical support. Mainstream support for Win 7 as an OS ended in 2015, and OEM sales of consumer Win 7 systems ended in 2014. Four years is a long time in this business.

  • by Anonymous Coward

    Ray tracing is great for specular (not spectacular...) reflections, i.e. light interacting with mirror-like, non-diffusing surfaces. It produces highlights, (perfect) refraction, (perfect) reflections and hard shadows. Anything else is not the domain of ray tracing. You can have fuzzy effects with ray tracing, but they come at an extreme processing power cost. Some effects are practically impossible to calculate with ray tracing. Ray tracing can contribute a small part of the rendering equation (the specular part).
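For concreteness, the "perfect" reflection and refraction mentioned above are each a single vector formula (Snell's law for refraction). A rough Python sketch, with illustrative names and unit vectors assumed:

    import math

    def reflect(d, n):
        """Mirror-bounce direction: d - 2(d.n)n, for incoming direction d
        and unit surface normal n."""
        dot = sum(a * b for a, b in zip(d, n))
        return [a - 2.0 * dot * b for a, b in zip(d, n)]

    def refract(d, n, eta):
        """Snell's law. eta is the ratio of refractive indices (n1/n2),
        e.g. roughly 1/1.5 when entering glass. Returns the transmitted
        direction, or None on total internal reflection."""
        cos_i = -sum(a * b for a, b in zip(d, n))
        k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
        if k < 0.0:
            return None                      # total internal reflection
        return [eta * a + (eta * cos_i - math.sqrt(k)) * b
                for a, b in zip(d, n)]

The extreme cost of "fuzzy" effects follows directly: a glossy surface replaces each of these single rays with dozens of randomly jittered ones, multiplying the work per bounce.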

    • by AHuxley ( 892839 )
      Ray tracing's got what games crave.
      It's got rays.
      More GPU and CPU and that will be perfect for every type of surface in a computer game.
      The need for more extreme processing is what will grow GPU and CPU sales.
      • by MrL0G1C ( 867445 )

        Ray-tracing doesn't seem necessary; games have impressive-looking lighting already, because developers have gotten good at faking it. I'm skeptical that real-time ray-tracing will be able to handle high polygon counts and still output at 4K.

        • by AHuxley ( 892839 )
          The need to drive up GPU and CPU sales is necessary.
          Microsoft will help along with the high polygon counts and 4K game marketing.
          The must-have games and GPUs that are ray-tracing 1.0 ready.
          • by MrL0G1C ( 867445 )

            No, I'm saying the tech probably isn't there yet, I doubt they can do real time hi-res ray-tracing on consumer level hardware.

            • by AHuxley ( 892839 )
              But the need to buy a new GPU just in case a new game needs that support will drive up sales.
              Just one more must have new selling point to hype for consumer level hardware.
  • by ickleberry ( 864871 ) <web@pineapple.vg> on Tuesday March 20, 2018 @09:27AM (#56290297) Homepage
    The run of the mill for the past few years is that graphics cards are for mining the cryptocurrency flavour of the month and creating magical AI bots. This is the first time in years I have seen an article that refers to the use of graphics cards for actual graphics.
    • That's because governments and GAFA have started a global crackdown on cryptocurrencies. NVIDIA strategists - as the smart bunch they are - feel the wind and repurpose their "tensor engine" for raytracing. That doesn't seem unnatural. At all.
    • Yeah! I can't wait to see how much cryptocurrency I can mine at once with these new ray-tracing GPUs!

      Just kidding. All I got is my gaming PC with a single GPU that mines when I'm not playing games.

    • by dj245 ( 732906 )

      The run of the mill for the past few years is that graphics cards are for mining the cryptocurrency flavour of the month and creating magical AI bots. This is the first time in years I have seen an article that refers to the use of graphics cards for actual graphics.

      My guess is that the graphics companies are seeing that cryptocurrencies may be peaking or on the decline. Between various countries banning them, municipalities banning them or charging more for electricity, and people starting to wise up that many cryptocurrencies are scams, the writing may be on the wall. AMD and Nvidia may be seeing a dropoff in sales; they would be the first to know if cryptocurrencies have peaked.

  • by ausekilis ( 1513635 ) on Tuesday March 20, 2018 @09:33AM (#56290333)

    Tracing Rays Through the Cloud [intel.com] is a pretty good example of what was "next-gen" 6 years ago. None of the imagery there was generated in real time (just read the paper), but it was still a good read about what goes into ray tracing. Intuitively we know what ray tracing is, but computationally, scenes full of reflective/refractive surfaces are a ton of work.

    Of course, I won't believe it's real-time until it can render a house of mirrors at 60fps+.

  • by Anonymous Coward

    ...can be found here:
    https://www.youtube.com/watch?v=jkhBlmKtEAk [youtube.com]

    Looks quite impressive even without the post filter in my opinion.

  • by Anonymous Coward

    What's wrong (to NVIDIA's eyes) with OpenGL?

    • by Holi ( 250190 )
      Because when you want to sell video cards for games, you focus on the technology and platform where the games are.
    • It only does triangle rendering and texturing. This is pretty useful, but not for getting realistic refractive effects. Reflections in OpenGL tend to be a bit of a cheat as well. The reflection looks fine for the most part, but unless you have a perfectly flat, or perfectly spherical mirror, they're typically an approximation.

      Personally, I don't think this sort of thing really justifies the cost of raytracing when current techniques work fairly well, but nVidia clearly disagrees.
    • Easy: AMD can use that. The dirty secret is Nvidia wrote DirectX 11 and already had a GPU with the features in hardware, to beat AMD (or, I should say, ATI at the time).

      Nvidia owns DirectX as much as MS does, and needs a closed standard to monopolize the market.

  • by RyanFenton ( 230700 ) on Tuesday March 20, 2018 @09:44AM (#56290411)

    When I was in college, I took two semesters of graphics - but this was in the late DOS era. Early OpenGL existed, but because this was a real theoretical college class on graphics, we built a real raytracer from pure math in C code and assembler rather than trying to stick to some arbitrary industry standard.

    Cubes, spheres, tori, lighting, reflections, we did it all, piece by piece in glorious 640x350. It was ugly, and eerie, but really fascinating in terms of seeing pure mathematical expressions becoming 3d objects, pixel by pixel.

    Since then, I've worked in several jobs frequently involving 'proper' graphics, even worked on a bunch of professional shipped games (mostly gameplay and systems, occasionally worked everywhere though) - and I can appreciate the need to use all the tricks that we do to make origami worlds, everything angled to the camera, but I really did enjoy creating worlds of actual objects, and having the camera pull its own shell of perspective out of the scene instead.

    Which is how most assets are sort of created, actually, in the asset creation tools. You model the object, rip the polygons out how you can, create meshes and surfaces, and then try and cheat on everything to make it seem like the 'real' object again as cheaply as you can get away with. It's not quite raytracing outside a few tools, but it's an interesting hybrid.

    Raytracers are a cool educational tool - but I can also see why they're only really trotted out when CPU manufacturers want to push for a race to buy more CPUs. They don't scale as well as modern techniques - and although there's some neat tricks you can do when you have your assets really 'present' mathematically (Demoscene stuff does this occasionally), it's usually not a better tradeoff than using the abstraction tools available to make it all work faster.
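As a rough aside, that "pure mathematical expressions becoming 3d objects" workflow survives today as sphere tracing over signed distance functions, the trick behind much of the Demoscene work mentioned above. A small illustrative Python sketch; the names are ours, not from any actual coursework or engine:

    import math

    def sdf_torus(p, major=1.0, minor=0.25):
        """Signed distance from point p to a torus around the y axis:
        pure math, no mesh anywhere."""
        q = math.sqrt(p[0] * p[0] + p[2] * p[2]) - major
        return math.sqrt(q * q + p[1] * p[1]) - minor

    def march(origin, direction, sdf, max_steps=128, eps=1e-4):
        """Sphere tracing: step along the (unit) ray by the distance the
        SDF guarantees is free of geometry. Returns a hit point or None."""
        t = 0.0
        for _ in range(max_steps):
            p = [o + t * d for o, d in zip(origin, direction)]
            dist = sdf(p)
            if dist < eps:
                return p                     # close enough: call it a hit
            t += dist
            if t > 100.0:
                break                        # ray escaped the scene
        return None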

    Ryan Fenton

    • Wait a minute... 640x350? You did raytracing in 16 colours?! Yikes.

      • by Anonymous Coward

        Wait a minute... 640x350? You did raytracing in 16 colours?! Yikes.

        Ever seen a life-sized ASCII-art nude printed on wide-carriage, green-bar paper and hung up on a wall?

        We made do with what we had. :-P

        We used to dream of 640x350, because they hadn't even invented that yet. The first porn I saw on the internet was a 320x200 interlaced animated GIF ... and it was monochrome. Most monitors didn't even have actual colours.

        Why, we used to have to debug our code on paper printouts by hand walking through the co

        • animated GIF - 1989 (part of the GIF89a extension to the GIF87a format)
          640x350 16 color EGA graphics as an option - 1984
          Just because you didn't see stuff until 5 years after it was commercially available doesn't mean the rest of us didn't.
        • My first computer was a Colour Computer 2 with 64KB and tape drive. Floppy drives were incredibly expensive.
          My first modem was 300bps. After that I went to 2400bps, then 14.4kbps and finally 28.8kbps.
          My first PC was an 8086 running at 8MHz with 256KB.

          From some parts of your comment, you started before me (wide-carriage green-bar paper printers).
          From other parts of your comment, you started after me (AFAIK 320x200 in monochrome did not exist; you either had monochrome Hercules graphics or four-colour CGA, th

    • by ledow ( 319597 )

      The tricks played to make things look real have been very convincing, and they take far less power.

      But go look at a teardown of a single scene in GTA V

      http://www.adriancourreges.com... [adriancourreges.com]

      "All in all there were 4155 draw calls, 1113 textures involved and 88 render targets."

      And a lot of clever trickery that engine programmers have to apply, texture artists have to take account of, and so on.

      The "shortcuts" give convincing near-realism on low hardware for a hefty development price.

      Ray-tracing gives convincing re

  • More features to turn off in the settings so I can improve frame rates and actually see what I'm supposed to be looking at.
  • by ledow ( 319597 ) on Tuesday March 20, 2018 @10:00AM (#56290503) Homepage

    Yeah, you have to love the graphic towards the bottom:

    "Board Industry Support"

    API: Microsoft.

    That's it. The only option. Not very "broad".

  • Intel was trying to push this back when it was clear they weren't making headway in the GPU space, partly to push a heavier reliance on CPUs over GPUs (or at least in conjunction with them), but it never seemed to gain any traction and was relegated to tech demos.

    https://www.geek.com/games/int... [geek.com]

    https://www.hpcwire.com/2010/0... [hpcwire.com]

    I guess we'll see how Nvidia does.

    • I guess that's why the article says that Intel already tried this, and mentions reasons why this time is different, including the decade of technology improvements since then. I wasn't sure why they would mention that.

  • Between the two major graphics card manufacturers (AMD and NVIDIA), it's not uncommon for either to work directly with Microsoft to introduce new DirectX features. The new Vulkan-style rendering APIs, for example, include major contributions by AMD. Unfortunately, when it comes to raytracing, the main issue is that the amount of processing required for anything better than a simple scene is too much to run in realtime on just about any modern system. If there's glassy / refractive objects, the amount o

  • Why is it that every demonstration of ray tracing results in every surface looking like velvet? The specular reflections from small bumps in the textures are just insane. Is it because they were dialed up on purpose, or is it some effect of raytracing that needs to be fixed with something like anti-aliasing?

    • I suspect you're looking at some of the staples of raytraced texturing, like simulated wood, metal, or stone. Heterogeneous materials approximated at too low a resolution will end up "looking like velvet" because there's more color change going on per sampled unit of distance than can be made to look both smooth and accurate. Anti-aliasing is a crutch. Better alternatives: increase your color depth, increase the resolution of the image, or adjust the formulae being used to simulate the material.
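In practice the cheapest mitigation is simply more samples per pixel: jittered supersampling averages the sub-pixel sparkle away instead of letting one lucky ray dominate. A small illustrative sketch in Python, where trace() stands in for whatever the renderer does per ray:

    import random

    def render_pixel(x, y, trace, samples=16):
        """Average several jittered rays per pixel so a high-frequency
        specular material can't alias into single-pixel sparkles."""
        r = g = b = 0.0
        for _ in range(samples):
            # jitter the sample position inside the pixel's footprint
            sx, sy = x + random.random(), y + random.random()
            cr, cg, cb = trace(sx, sy)
            r, g, b = r + cr, g + cg, b + cb
        return (r / samples, g / samples, b / samples)

It is a brute-force fix, which is exactly why it gets expensive at high resolution.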
      • Thanks, you suspected correctly. It was the metallic surfaces in the video. When the camera gets close you can see they are textured and cease to sparkle uncontrollably. It's a shame, because while the video looked good in principle, showing off the wonderful light reflections, I think I have seen far more realistic-looking footage from traditional game engines.

  • I remember around 2006 (when AMD had just acquired ATI?) Intel was making a lot of noise about running graphics directly on the CPU; GPU-less machines were their big prediction.

    They were mentioning real-time ray-tracing as the next big thing in graphics, and their CPUs were obviously the natural hardware to do it on. Here is an example from 2007:
    https://www.youtube.com/watch?... [youtube.com]

    AMD and Nvidia immediately pointed out that their GPUs were much better for this job, and then nothing happened. Now I see they

  • I actually found that demo video pretty unimpressive.

    https://www.youtube.com/watch?... [youtube.com] to link directly.

    Oh sure, it's PRETTY but there are some odd artifacts:
    - the table edges at 0:08+ flicker oddly
    - 0:47 the light effect from the source looks hemispherical, but the device itself wouldn't be?
    - the coffee cup with saucer at 1:08 has a weird glowy base

    These may be explicable, but they seemed odd in a tech demo.

  • by mentil ( 1748130 ) on Tuesday March 20, 2018 @02:27PM (#56292341)

    Turns out raytracing isn't the holy grail of gaming graphics, although it's been hyped for so long that it seems like it. I always thought Pixar films were raytraced, but they were actually rasterized. Cars was their first film that used raytracing at all, and even then it was only during the big race (due to all the reflections, presumably). I do know that shows like Babylon 5 and I believe ST:TNG did use raytracing, though. Nvidia shows off 'realtime raytracing' every few years but it never takes off; presumably better overall results are still achieved via rasterization. Sure, you can get sexy shadows and reflections, but your poly count will be at early PS3-era levels. Also, there are problems with raytracing and meshes that animate, like, say, humans, that make it much slower, which is why you almost always see it done with static meshes like cars or buildings. Turns out raytracing isn't even the ultimate rendering technology; Path Tracing [wikipedia.org] is closer, if not theoretically perfect.

    It's also worth noting that a form of raytracing has been in use in realtime graphics for a while, called relief mapping [wikipedia.org], which has made it into games.
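The difference is small in code and large in cost: a classic ray tracer only recurses along mirror and refraction directions, while a path tracer keeps bouncing in random directions to pick up indirect light. A toy sketch of that loop in Python; the scene object and its intersect() method are hypothetical stand-ins, not any shipping engine's API:

    import math, random

    def random_hemisphere(n):
        """Random unit direction in the hemisphere around unit normal n,
        by rejection-sampling the unit ball and flipping as needed."""
        while True:
            v = [random.uniform(-1.0, 1.0) for _ in range(3)]
            if 0.0 < sum(x * x for x in v) <= 1.0:
                break
        length = math.sqrt(sum(x * x for x in v))
        v = [x / length for x in v]
        if sum(a * b for a, b in zip(v, n)) < 0.0:
            v = [-x for x in v]              # flip into n's hemisphere
        return v

    def trace_path(origin, direction, scene, depth=0, max_depth=5):
        """One path-tracing step: emitted light plus light arriving via a
        random bounce, rather than only the mirror direction."""
        if depth >= max_depth:
            return (0.0, 0.0, 0.0)
        hit = scene.intersect(origin, direction)   # hypothetical scene query
        if hit is None:
            return scene.background
        bounce = random_hemisphere(hit.normal)
        incoming = trace_path(hit.point, bounce, scene, depth + 1, max_depth)
        return tuple(e + a * c for e, a, c
                     in zip(hit.emission, hit.albedo, incoming))

Many such random bounces per pixel have to be averaged before the noise settles, which is why path tracing remains the offline gold standard rather than a realtime technique.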

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...