AI Graphics Games

Nvidia's AI-Powered Scaling Makes Old Games Look Better Without a Huge Performance Hit (theverge.com) 41

Nvidia's latest game-ready driver includes a tool that could let you improve the image quality of games that your graphics card can easily run, alongside optimizations for the new God of War PC port. The Verge reports: The tech is called Deep Learning Dynamic Super Resolution, or DLDSR, and Nvidia says you can use it to make "most games" look sharper by running them at a higher resolution than your monitor natively supports. DLDSR builds on Nvidia's Dynamic Super Resolution tech, which has been around for years. Essentially, regular old DSR renders a game at a higher resolution than your monitor can handle and then downscales it to your monitor's native resolution. This leads to an image with better sharpness but usually comes with a dip in performance (you are asking your GPU to do more work, after all). So, for instance, if you had a graphics card capable of running a game at 4K but only had a 1440p monitor, you could use DSR to get a boost in clarity.

DLDSR takes the same concept and incorporates AI that can also work to enhance the image. According to Nvidia, this means you can upscale less (and therefore lose less performance) while still getting similar image quality improvements. In real numbers, Nvidia claims you'll get image quality similar to running at four times the resolution using DSR with only 2.25 times the resolution with DLDSR. Nvidia gives an example using 2017's Prey: Digital Deluxe running on a 1080p monitor: 4x DSR runs at 108 FPS, while 2.25x DLDSR gets 143 FPS, only two frames per second slower than running at native 1080p.
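
The factor arithmetic is worth spelling out: DSR and DLDSR factors multiply the total pixel count, not each axis, so the per-axis scale is the square root of the factor. A minimal Python sketch using the numbers from Nvidia's example (nothing here is Nvidia's code; the helper function is purely illustrative):

    import math

    def render_resolution(native_w, native_h, factor):
        # DSR/DLDSR factors scale total pixels, so each axis scales by sqrt(factor)
        scale = math.sqrt(factor)
        return round(native_w * scale), round(native_h * scale)

    native = (1920, 1080)  # the 1080p monitor in Nvidia's Prey example
    for label, factor in [("4.00x DSR", 4.0), ("2.25x DLDSR", 2.25)]:
        w, h = render_resolution(*native, factor)
        print(f"{label}: renders internally at {w}x{h}")
    # 4.00x DSR: renders internally at 3840x2160
    # 2.25x DLDSR: renders internally at 2880x1620

So 2.25x DLDSR pushes just over half the pixels of 4x DSR (2.25/4 = 0.5625), which is where the claimed performance headroom comes from.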


Comments Filter:
  • Stop posting ads /.
    • Re: (Score:3, Insightful)

      New products [slashdot.org] are both news and ads... So where do we draw the line?
    • Re:Feh (Score:5, Insightful)

      by thegarbz ( 1787294 ) on Saturday January 15, 2022 @06:38AM (#62174701)

      What ad? Do you see a product named? A price? All I see is a discussion about a technology being introduced to improve gaming visual quality and performance.

      But if you insist I'll return you to your regular slashdot programming:
      - The election was stolen.
      - Biden is ruining America
      - Ivermectin + VitaminD is the cure to COVID.

      You happy now?

      • Yes, this is an ad. It's what Nvidia does: fake tech news that sounds amazing but isn't. It's a press release, verbatim, from a for-profit company about a product they make that you can buy. FFS, learn what marketing is.
        • Maybe you should learn what "press release" means. Also what "fake" means. Fuck, man, learn what "marketing" is (hint: all ads are marketing, not all marketing is ads).

          Holy shit I just realised you didn't manage to write a single sentence without misunderstanding a word. You should get yourself a dictionary.

  • by AmiMoJo ( 196126 ) on Saturday January 15, 2022 @06:10AM (#62174675) Homepage Journal

    AMD and Intel are both scrambling to catch up here. Nvidia developed AI cores for its GPUs early, when everyone thought it was just a fad and wouldn't produce anything useful. Now they have the best scaling in the industry by a wide margin, and nobody else has the silicon to match it.

    • Intel doesn't even have a dGPU on the market yet. Arc won't hit the desktop scene until April at the earliest. AMD is selling every dGPU they make. DLSS isn't really changing anything.

      Yeah NV is taking market share from AMD, but that has more to do with AMD prioritizing elsewhere. AMD could probably double their sales of RDNA2 dGPUs if they were to prioritize making more.

    • This particular change in the Nvidia driver is a response to AMD's similar technology. So what we see here is Nvidia's scramble. Not that you would know it from this particular ad.

  • Shrug (Score:4, Insightful)

    by Kokuyo ( 549451 ) on Saturday January 15, 2022 @07:11AM (#62174723) Journal

    ...is what my GTX780 would do if it could read such news.

    There are no GPUs in my budget. Until that changes, things like this are basically academic factoids and nothing more.

    • Even if they're in your budget, it takes a certain amount of... something that I don't have to pay 200+% of MSRP for one.
      • Exactly. I've been thinking about building a new PC but the thoughts always end with "The price of these GPUs is TOO DAMN HIGH".

        I can't be the only one who's specifically not upgrading because of the price of a new video card. I am still rocking the FX-8350 with 2x Zotac GTX 950 AMP! cards, and yeah, it's a fucking moldy potato by today's standards, but it still runs the games I want to play at reasonable quality. (If they support SLI then it's roughly equivalent to having a 970.) And for non-game purposes fr

        • You're definitely not the only one. My 12 year old PC is still humming along, doing just fine for anything but modern PC games. I'll think about getting a new PC when I don't have to pay $2000 for a decent GPU.

          My fear, though, is that Nvidia doesn't really have much competition among GPUs these days. Once they've jacked their prices up, what exactly is going to push them back down? They're already releasing older chipsets at inflated MSRPs, which they could not have gotten away with a few years ago. I

          • by BraxM ( 9258611 )
            I'm in the same boat with my 2500K + GTX 660 Ti, which is 10 years old at this point. I've been waiting to upgrade for two years now, and I don't see myself doing so until things settle down and prices at least come down to near-MSRP levels (which are still too damn high compared to what they used to be). I could afford it in theory, but I'd feel like I'm getting screwed, and that does not sit well with me. I usually look for something that gives good value for the money spent, which just does not seem to exist anymore.
    • things like this are basically academic factoids and nothing more

      Most articles about new technology are academic factoids and nothing more for the first several years of their release. That's no reason to dismiss them, though if you're happily running a GTX780 I doubt you are the target market for this technology.

    • by antdude ( 79039 )

      Ditto. I installed a used GTX 750 Ti (2 GB of VRAM) video card into my mostly new (finally upgraded its mobo, RAM, & CPU after a decade) PC. I could use its onboard Intel video, but that will suck for my rare PC gaming.

  • When will someone come up with a tool to make vinyl records sound like CDs?
  • There's a chip shortage, but honestly Nvidia is gouging gamers. They really haven't done anything effective to fight the crypto-ponzi-scammer-jerks. Yes, I own quite a few Nvidia video cards for doing deep learning, but at current prices it's a total rip-off. I'm glad I bought an RTX 2060 when it was affordable. Right now, I have zero plans to buy any Nvidia video cards until MSRP is reasonable and you can get one near MSRP.
  • by MrL0G1C ( 867445 ) on Saturday January 15, 2022 @09:59AM (#62174859) Journal

    I'd be a lot more interested if they could do something like this: https://www.youtube.com/watch?... [youtube.com] in real time (it re-skins GTA using AI trained on real-world imagery so that it looks real-ish). For anyone interested in what AI can do in the fields of image manipulation and physics simulation, I'd totally recommend subscribing to https://www.youtube.com/user/k... [youtube.com]

  • DLSS (Score:4, Interesting)

    by egr ( 932620 ) on Saturday January 15, 2022 @10:16AM (#62174867) Journal
    DLSS from Nvidia was a huge disappointment for me. I had high hopes for the technology, but unfortunately what looks good in promotional screenshots is totally unusable in real games due to visual artefacts and ghosting. Hopefully this works better.
    • I have been ghosted a few times, but never by a video card. I really don't like where the technology is going these days!

    • DLSS from Nvidia was a huge disappointment for me. I had high hopes for the technology, but unfortunately what looks good in promotional screenshots is totally unusable in real games due to visual artefacts and ghosting. Hopefully this works better.

      And when did you last use it? DLSS has changed a lot since its early release (where it was absolute garbage). Specifically, the recent release of DLSS 2.2 focused almost exclusively on eliminating ghosting. There are also plenty of examples of people rednecking updated DLSS versions into older games, such as copying the DLSS DLLs from Cyberpunk 2077 into Doom Eternal to address ghosting in that game (a file-copy sketch follows below).

      It was a huge disappointment when it came out, to the point where I actively took the performance hit instead.
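
      For the curious, the DLL swap mentioned above amounts to replacing a single file. A hedged sketch, not a recommendation: nvngx_dlss.dll is the DLSS library that DLSS-enabled games ship with, but both install paths below are hypothetical and will vary per system.

          import shutil
          from pathlib import Path

          # Hypothetical paths -- adjust for your own installs.
          source = Path(r"C:\Games\Cyberpunk 2077\bin\x64\nvngx_dlss.dll")  # newer DLSS
          target = Path(r"C:\Games\DOOMEternal\nvngx_dlss.dll")             # older DLSS

          # Keep a backup of the game's original DLL, then swap in the newer one.
          shutil.copy2(target, target.with_name(target.name + ".bak"))
          shutil.copy2(source, target)
          print(f"Replaced {target} with the DLL from {source.parent}")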

      • by egr ( 932620 )
        Kind of funny how you are mixing FXAA, which is an anti-aliasing technology, with DLSS, which is upscaling. I used DLSS last month in several games, including Cyberpunk, and the results are still disappointing. The effect is easily detectable and quite irritating.
        • Kind of funny how you are mixing FXAA, which is an anti-aliasing technology, with DLSS, which is upscaling.

          I'm not mixing anything. You missed my point in the comparison. I could have used the basic fast square root function as an example instead of FXAA. The point is that most settings and algorithms in software that impact how things are drawn on screen are fixed in their implementation. FXAA has a clear mathematical algorithm behind it. You can trace what happens to output pixels from input pixels with the only adjustment being a couple of variables. Whereas DLSS has an "AI" model behind it which is constantly tweaked and changed between versions.
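
          The traceability point can be made concrete with a toy, FXAA-flavoured filter. This is not the real FXAA shader, just an illustration of the general shape: every output pixel is a fixed arithmetic function of its neighbourhood plus a threshold constant, so input-to-output behaviour is fully traceable.

              EDGE_THRESHOLD = 0.125  # the "couple of variables" kind of knob

              def luma(rgb):
                  # Standard luma weights; rgb components in [0, 1]
                  r, g, b = rgb
                  return 0.299 * r + 0.587 * g + 0.114 * b

              def fxaa_like(pixels, x, y):
                  # Blend pixel (x, y) with its 4-neighbourhood on high luma contrast.
                  centre = pixels[y][x]
                  neighbours = [pixels[y - 1][x], pixels[y + 1][x],
                                pixels[y][x - 1], pixels[y][x + 1]]
                  lumas = [luma(p) for p in neighbours + [centre]]
                  if max(lumas) - min(lumas) < EDGE_THRESHOLD:
                      return centre  # flat area: pass through unchanged
                  # High-contrast edge: average to soften the stairstep.
                  return tuple(sum(p[c] for p in neighbours + [centre]) / 5
                               for c in range(3))

          A learned upscaler offers no such fixed mapping; its behaviour shifts whenever the model weights are retrained, which is the version-to-version change being described.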

          • by egr ( 932620 )

            I'm not mixing anything. You missed my point in the comparison. I could have used the basic fast square root function as an example instead of FXAA. The point is that most settings and algorithms in software that impact how things are drawn on screen are fixed in their implementation. FXAA has a clear mathematical algorithm behind it. You can trace what happens to output pixels from input pixels with the only adjustment being a couple of variables. Whereas DLSS has an "AI" model behind it which is constantly tweaked and changed between versions.

            Most algorithms are tweaked throughout their lives; AA is no exception. Even FXAA has many implementations.

            More or less disappointing than a frame rate increase equivalent to literally buying new hardware two generations in the future? Just remember what the effect here is: a bit of ghosting to save literally a thousand dollars in video hardware. Maybe you and I just have a different threshold for where we are disappointed.

            Depends on what you are trying to achieve: higher numbers in benchmarks for marketing, or actually comparable quality. In competitive online gaming those effects really get to you; you start seeing things where there is nothing. Don't get me wrong, I really want this technology to succeed, but so far it is under-performing in my opinion.

  • Maybe it is because I am color blind, but looking at the single image sample provided in the article, I cannot really see a difference when zoomed in. Lines may be slightly crisper, but that's all I can really see.

    Did I miss more examples of the technology? Because as is, label me unimpressed.

    • It's because they forgot to make it "look sharper by running them at a higher resolution than your monitor natively supports." Which makes your monitor display nothing, because it's at a higher resolution than it supports.
  • How well will it "enhance" my "enjoyment" of Leisure Suit Larry 1?
    Keep in mind that the old AGI games only run at 160x200 pixels.

  • I've been playing through Torchlight 2 at 1440p, and its antialiasing is kind of mediocre; the image is still pretty aliased. So I took to using DSR at 4x (the only DSR factor today that looks good with 0% smoothing), and that solved my problem (nice and smooth now). But even in a game as old as Torchlight 2, rendering at 5120x2880 on a 144Hz monitor is a big ask, and the game often drops well below max refresh rate.

    I gave DLDSR a shot, and found that both the 1.78x and 2.25x scaling factors (with 0% smooth
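
    The "only 4x looks good with 0% smoothing" observation has a simple arithmetic explanation, assuming the standard list of DSR factors: 4x is the only factor whose per-axis scale is an integer, so its downscale is an exact 2x2 pixel average and needs no smoothing filter. A quick sketch:

        import math

        DSR_FACTORS = [1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00]  # the usual menu

        for f in DSR_FACTORS:
            per_axis = math.sqrt(f)
            note = "exact pixel average" if per_axis == int(per_axis) else "needs filtering"
            print(f"{f:.2f}x -> {per_axis:.3f} per axis ({note})")
        # Only 4.00x lands on exactly 2.0 per axis, e.g. 2560x1440 -> 5120x2880.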

  • In my case, installing the new version (on a Legion 7i gaming laptop with AMD) results in a black screen after login, requiring a safe-mode boot to revert to a prior driver. Probably a limited case, but I'd suggest anyone trying this out back up the existing driver or take a system restore point before upgrading (one way to script that is sketched below).

    I've also noticed that when upgrading through GeForce Experience it will reboot with no warning when it's done - save your stuff before you start!
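
    One way to script the restore-point advice on Windows, as a minimal sketch: it assumes System Restore is enabled on the system drive and the script runs from an elevated prompt (Checkpoint-Computer is a stock PowerShell cmdlet; the description string is arbitrary).

        import subprocess

        # Snapshot the system before the driver install via PowerShell.
        subprocess.run(
            ["powershell", "-Command",
             'Checkpoint-Computer -Description "pre-GeForce-driver" '
             '-RestorePointType MODIFY_SETTINGS'],
            check=True,
        )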

  • Background: I have an EVGA 3080Ti (so I am in the target market for this technology), my monitor is 3840x2160 at 120Hz with GSync (variable refresh rate technology). My top priority when setting up a game is getting 120fps most of the time with only brief and occasional dips into the 90s. If the GPU can't handle that at 2160p at max graphics settings, I tend to reduce the resolution before the graphics settings; to my eyes, a 1440p image at max settings looks better than a 2160p image at reduced settings.

  • It seems ironic that the example picture in the linked article does not improve anything.

  • And the upscaling is done with an ANN.

    So what is new?
