AI Games Technology

'Cyberpunk 2077' Finally Shows What DLSS Is Good For (vice.com) 69

An anonymous reader shares a report: More recent Nvidia graphics cards have a proprietary feature called Deep Learning Super Sampling (DLSS), and while it's often been touted as a powerful new rendering tool, the results have sometimes been underwhelming. Some of this is down to the oddly mixed messaging around how DLSS was rolled out: it only works on more recent Nvidia cards that are still near the cutting edge of PC graphics hardware, yet DLSS is designed to render images at lower resolutions and display them as if they were rendered natively at a higher resolution. If you had just gotten a new Nvidia card and were excited to see what kind of framerates and detail levels it could sustain, what DLSS actually did sounded counterintuitive. Even games like Control, whose support of DLSS was especially praised, left me scratching my head about why I would want to use the feature. On my 4K TV, Control looked and ran identically well with and without DLSS, so why wouldn't I just max out my native graphics settings rather than use a fancy upscaler? Intellectually, I understood that DLSS could produce similarly great-looking images without taxing my hardware as much, but I neither fully believed it, nor had I seen a game where the performance gain was meaningful.

Cyberpunk 2077 converted me. DLSS is a miracle, and without it there's probably no way I would ever have been happy with my graphics settings or the game's performance. I have a pretty powerful video card, an RTX 2080 Ti, but my CPU is an old i5 overclocked to about 3.9 GHz and it's a definite bottleneck on a lot of games. Without DLSS, Cyberpunk 2077 was very hard to get running smoothly. The busiest street scenes would look fine if I were in a static position, but a quick pan with my mouse would cause the whole world to stutter. If I was walking around Night City, I would get routine slow-downs. Likewise, sneaking around and picking off guards during encounters was all well and good, but the minute the bullets started flying, with grenades exploding everywhere and positions changing rapidly, my framerate would crater to the point where the game verged on unplayable. To handle these peaks of activity, I had to lower my detail settings way below what I wanted, and below what my hardware could support for about 80 percent of my time with the game. Without DLSS, I never found a balance I was totally happy with. The game neither looked particularly great, nor did it run very well. DLSS basically solved this problem for me. With it active, I could run Cyberpunk at max settings, with stable framerates in all but the busiest scenes.
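
As a rough illustration of the "render lower, display higher" trade-off described in the summary above, the sketch below computes the internal render resolution for a 4K output under the per-axis scale factors commonly reported for DLSS 2.x quality modes. The exact factors are an assumption for illustration (they are not stated in the article), but they show where the performance headroom comes from.

```python
# Illustrative only: internal render resolution vs. display resolution for a
# DLSS-style upscaler. The per-axis scale factors below are the values commonly
# reported for DLSS 2.x quality modes, assumed here for the sake of the example.

OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K display resolution

SCALE_PER_AXIS = {
    "Quality": 2 / 3,            # ~66.7% of output width/height
    "Balanced": 0.58,            # ~58%
    "Performance": 0.50,         # 50%
    "Ultra Performance": 1 / 3,  # ~33.3%
}

for mode, s in SCALE_PER_AXIS.items():
    w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
    shaded = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{mode:>17}: renders {w}x{h}, shades {shaded:.0%} of the output pixels")

# Quality mode at 4K, for example, shades roughly a 2560x1440 frame (~44% of the
# pixels) and the upscaler reconstructs the rest of the detail for the 4K output.
```

Whether that reconstruction looks "free" in practice is exactly the question the author weighs above.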


Comments Filter:
  • by Holi ( 250190 ) on Friday December 11, 2020 @11:11AM (#60819474)
    It's not like many people will have systems that will run the game. Between the bugs and the lack of systems in gamers' hands, CD Projekt Red is looking at a severely disappointing launch.
    • "Cyberpunk 2077 had some interesting glitches on launch, but those haven't dissuaded anyone from playing it. The title set a new record for the largest number of simultaneous players in a single player game, with a record 1,003,262 playing just after the December 10th launch, according to Steam Database. That tops the previous Steam record of 472,962 players set by Fallout 4 back in 2015."
      • No Man's Sky had over 200K on launch day, then it bombed.

        • NMS is doing just fine. They've released multiple patches with massive improvements to content and gameplay. The sales are just fine and by all accounts the patches bring it up to (more or less) what was promised at launch. The players seem more than happy with it. It's not my kind of game, but it didn't bomb. It's got almost 150k Steam reviews, which indicates good sales.

          It didn't become a Fortnite-level gaming event like the hype said, but it's had a _very_ long tail, often still selling for full pop.
          • As someone who had preordered it, I have good knowledge of their history. It was hyped to Heaven and back; then it launched, it was utter crap, it bombed immediately, and it took over a year to (slowly) be reborn. I have been playing it on and off, and yes, it did improve, but at launch and a year after it was a mess.

            • And now that it's finally close to what they promised, it's $30 on sale on Steam!

              Yet another reason I generally don't play any AAA games anymore, and if I do, I wait a few years before buying. Early adopters get screwed on just about everything.

            • I remember the last time this all happened. It was over a decade ago and it was called Spore. I stayed away from No Man's Sky for this very reason.
      • How many GOG players are not on Steam?

    • by thegarbz ( 1787294 ) on Friday December 11, 2020 @12:01PM (#60819616)

      Disappointing is not the word I'd use to describe a game with 8 million presales and 3.7 million launch-day sales. "Fastest selling of all time" may be a more apt title, which is far from a disappointment.

      Plenty of people have systems that run that game. Not many people have systems that can run that game with the settings cranked all the way up. There's a huge difference, and one which will apply to every current gen AAA title.

    • I saw a guy on Twitch playing this on an older rig, with a GTX 1080 or something like that. Still got decent graphics and fps.
      • I'm playing just fine on an AMD 5700 XT, and my sister had to turn some things down but is playing just fine on a much older RX 580 8GB (it was a $200 card two years ago).

        People having problems running it are either:
        1) trying to run with real-time ray tracing,
        2) running at resolutions higher than 1920x1080 (1080p), or
        3) using other stupidly high settings that their hardware doesn't support well.

        Strangely, AMD GPU users don't have any performance issues on a wide range of cards, while NVIDIA users do. But then there are a lot of fancy graphics opti

        • I let the game sort out its own graphics settings; I don't think they're nearly as low as they could go, though. And it's playing nicely on my GTX 960.

          The only bugs I've seen thus far are my health bar not showing up during a fight after a fight that initiated immediately after a re-load, and a bit of tutorial overlay that seemed to persist longer than it should have.

        • Checking in with a GTX 1070 and running just fine at 1440p. Granted, nothing is turned up past medium, but I haven't noticed a drop in frame rate even once through the first act.

          Can't wait for my RTX 3080 to arrive so I can crank it up and behold 4K beauty, and then probably crash due to ray-tracing or DLSS bugs.

    • Launches mean nothing. Almost every game released today deals with bugs and tech issues.

    • by awyeah ( 70462 ) *

      My 2070 Super supports DLSS.

    • Severely disappointing launch?

      Yeah no...

      https://finance.yahoo.com/news... [yahoo.com]

    • Except for them pre-selling more copies than literally any other game, ever [gamesindustry.biz]. Or having more people playing it simultaneously than any game ever sold through Steam [tomshardware.com].

      How disappointing. I would hate to fail like that.

  • Another cyberpunk ad.
    • It reads like a comment someone would leave on Steam as part of a game review.

      • For a Steam review that would be unusually informative. The best Steam reviews I read:
        Recommended: "You can pet the cat"
        Or:
        Recommended: "They made an 18+ game, then waited until everyone turned 18 before releasing it".
    • Anybody remember the times when we had several WoW posts per week [slashdot.org]?
    • Re:Cool... (Score:5, Insightful)

      by thegarbz ( 1787294 ) on Friday December 11, 2020 @12:02PM (#60819622)

      Yeah I know. We really need to not discuss the most anticipated game of 2020 on here along with a technical development. This is news for nerds, the only thing that matters is something something Trump something rigged election something.

    • Yeah, why in the world would a tech news aggregator want to talk about the most successful video game launch of 2020, and arguably ever?

      Don't like it? Feel free to scroll past it.

      • by Cederic ( 9623 )

        the most successful video game launch of 2020, and arguably ever

        Ah, bollocks to that. Highest grossing, sure, but successful? I've played games released in 2020 that didn't get a large number of people complaining of crashes, that didn't have performance issues, that didn't get reviews complaining about a stultifying story.

        I'm comfortable with calling it a successful release but chill with the hyperbole.

  • Good, now imagine that in a high resolution (no screen door) micro-LED VR headset. By 2028?

  • Well, we found the NVIDIA shill.

  • Control may have been praised for the feature, but why was he scratching his head? I mean, if anything Control demonstrated that with full ray tracing enabled, even an RTX 2080 was not able to run the game at 1080p60 without dropping frames with DLSS off. I think if anything Control perfectly demonstrated what that feature was for: people who prefer ray tracing over edge sharpness and who didn't spend $1500 on a graphics card.

    I don't for a moment buy that the reporter ran Control without performance i

    • The trick for getting high frame rates with ray tracing is to run games at 320x200.
      I'm getting over 9000 FPS!

      • by UnknownSoldier ( 67820 ) on Friday December 11, 2020 @12:44PM (#60819742)

        You might laugh, but professional CS:GO [prosettings.com] gamers have been known to compete at lower resolutions such as 800x600, 1280x720, 1024x768, 1280x960, etc.

        For some, anything less than 120 FPS is crap. For others, 60 is the threshold. Console peasants are typically stuck at 30 FPS, although thankfully that has finally started to change with this gen. (A quick frame-time calculation follows this sub-thread.)

        • by Rockoon ( 1252108 ) on Friday December 11, 2020 @01:11PM (#60819802)
          You are talking about the (O)utput in I/O, but haven't considered the (I)nput.

          Console gamers also do not have low-latency, high-update-rate, ultra-sensitive (I)nput methods, but PC gamers do, which goes a long way toward creating a real need for the (O)utput also being low latency, with a high update rate, on PC but not on console.
          • That's a great point about Input Latency! Thanks for mentioning that.

            I'm not sure I would be quick to write consoles off "entirely" --- they do have a "fixed" set of hardware which "should" make this easier to manage -- they typically have one core dedicated to the hypervisor. I'm not sure if input devices are handled there? I say "should" because I'm not aware of any studies comparing input latency on PC and consoles. Are you?

            Musicians tend to be ultra-sensitive about input + audio latency. I kno

        • by awyeah ( 70462 ) *

          FPS is one of those things where you don't really get it until you try it. Once I upgraded to a 165Hz monitor and got 150+ fps in most games... when I play at 60, it's definitely noticeable. Before I could do that, 60fps felt smooth as butter.

          Definitely a first-world problem!

        • A high frame rate is good, but aspect ratio is also important. I've owned an ultra-wide monitor for about a year now, and playing on a 16:9 monitor feels incredibly cramped and a lot less immersive. Imagine going from 16:9 to 4:3/5:4, but worse.
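
As an aside on the frame-rate thresholds mentioned in this sub-thread (30, 60, 120+ FPS), the matching per-frame time budgets are a one-line calculation; the numbers below are plain arithmetic, not benchmarks.

```python
# At a target of N frames per second, the CPU and GPU together have 1000/N
# milliseconds to produce each frame.
for fps in (30, 60, 120, 165):
    print(f"{fps:>3} FPS target -> {1000 / fps:.1f} ms per frame")
# 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms, 165 -> 6.1 ms
```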

    • Yeah, the guy who wrote this was full of shit. DLSS is fucking amazing and looks way better than native resolution.

      • Hmm, DLSS is great, but it certainly doesn't look better than native resolution, at least not on my 3080.

        Even in DLSS Quality mode (which is the highest) there are some noticeable artifacts with DLSS that I can see when playing, including temporal blurring artifacts (on far away detailed surfaces as the up-scaling struggles to fill in the gaps in information), ringing on high-contrast edges (from too much sharpening), and more noise in certain textures (also from the upscaling struggling to properly fill in

      • Are you saying DLSS looks better than native resolution rather than disabling ray tracing so you can get decent framerates? Or are you saying DLSS looks better full stop? Because that is demonstrably false: there are plenty of examples of native rendering being sharper and less glitchy than DLSS. That said, the differences are minimal, certainly minimal enough that ray tracing + DLSS is definitely preferred.

        • by Kartu ( 1490911 )

          Check AnandTech's article:
          https://www.anandtech.com/show... [anandtech.com]

          DLSS 2 is essentially a TAA derivative and comes with all its strengths and weaknesses.

          It does improve lines (e.g. hair, eyebrows, grass), but it wipes out fine details, adds blur, and is particularly bad with small, quickly moving objects.

          The "better than native" claim being repeated so many times in regard to this technology is an embarrassment. Of course antialiasing could improve CERTAIN aspects of the visuals, but it has nothing to do with upscaling, t

          • No need, I run DLSS and know what it looks like. I'd classify it as pretty darn good, but definitely nothing like running at native resolution. I can only assume people claiming it's better than native are well overdue to have their prescription checked :-)

  • my CPU is an old i5 overclocked to about 3.9 GHz and it's a definite bottleneck on a lot of games.

    But I was told by esteemed Slashdot users that CPUs were irrelevant to gaming, and a powerful GPU card is all that matters!

    • That's true to an extent, but that processor can't even keep up with the thread count of consoles from 2013. 3.9GHz isn't even an overclock either; at best it's MCE (Multi-Core Enhancement).

      You gotta hand it to him for the dedication of pairing a $50 CPU with a $1200 graphics card though.

    • My CPU utilization while playing Cyberpunk 2077 is 25-40% (I do have an 8-core/16-thread 3700X though), while my 5700 XT video card stays stable at around 85-97% utilization. At least in my anecdotal case, it feels like the GPU is utilized more.

    • But I was told by esteemed Slashdot users that CPUs were irrelevant to gaming, and a powerful GPU card is all that matters!

      No, you were probably told something with a bit more nuance and complexity.

      Games rely far more on the GPU than the CPU, yes, but that doesn't make the CPU irrelevant. It means that if you're upgrading a single component, you're probably going to get more bang for the buck out of a GPU upgrade than a CPU upgrade.

      Moreover, CPUs hit diminishing returns far more quickly than GPUs do. An RTX 3090 isn't going to turn your computer into something that can run CP2077 if your CPU is a 3rd-gen i3. Yes, THAT sort of upg

      • No, you were probably told something with a bit more nuance and complexity.

        I did mention this was a Slashdot poster right? There was no "nuance and complexity".

    • by yarbo ( 626329 )

      There are like 7 generations of i5 processors; is it a 3xxx or a 10xxx?

  • On my 4K TV, Control looked and ran identically well with and without DLSS

    He's either full of it, or he played Control without raytracing turned on. You couldn't get stable playable framerates with a 2080 Ti at 1440p with RT on without DLSS (I know, I tried), and he claims he played it in 4K and couldn't notice any difference? Yeah, right.

    • Definitely didn't have RT enabled.
      My 2080 Ti chokes in Control doing RT without DLSS at 1080p. Though, I can concur that with DLSS it does 4K just fine with RT enabled, and straight up- you can't tell it's magically upscaled.

      Control was what sold me on DLSS.
      • and straight up- you can't tell it's magically upscaled.

        Probably because it's not. That's not how DLSS works.

        • Oh, I'm sorry.
          Should I have given the technical explanation?
          A neural network is trained on pairs of low-resolution and high-resolution renders until it learns to translate one into the other. The network then takes the low-resolution renders produced on your card to magically^M^M^M^M^M^M^M^M^M upscale the low-resolution image. (A rough sketch of this idea follows at the end of this thread.)

          Were you trying to sound smart?
          • Yeah maybe I was being snarky. I guess my point was that it shouldn't appear so impressive compared to regular upscaling because it does actually get data from the higher resolution image unlike traditional upscaling which has nothing more to work with than the information in the lower res version. It's still a great technology though for increasing framerates at "high" resolution without requiring the card to actually render everything at the high res.
            • I guess my point was that it shouldn't appear so impressive compared to regular upscaling because it does actually get data from the higher resolution image unlike traditional upscaling which has nothing more to work with than the information in the lower res version.

              Snarkiness aside, that is a really good point.
              I hadn't really thought about it like that.
              I mean, it's pretty damn abstract since the "information" is a trained network of neurons, and it's general purpose (as of DLSS 2.0) but still, it is information. It's not that magical that it's better.
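
To make the training idea from this thread concrete, here is a minimal, hypothetical sketch of supervised super-resolution in PyTorch. It is not Nvidia's actual DLSS pipeline (DLSS 2.x also feeds in motion vectors and previous frames); the `TinySR` model, the 2x factor, and the random stand-in tensors are all invented for illustration.

```python
# Hypothetical sketch: a small network learns to map low-resolution renders to
# their high-resolution counterparts. This only illustrates the "train on
# low/high-res pairs" idea from the comment above, not Nvidia's DLSS pipeline.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySR(nn.Module):
    """Upscale an image 2x with a few conv layers plus a pixel shuffle."""
    def __init__(self, channels=3, features=32, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(),
            nn.Conv2d(features, channels * scale * scale, 3, padding=1),
        )
        self.shuffle = nn.PixelShuffle(scale)  # rearranges channels into 2x spatial detail

    def forward(self, low_res):
        return self.shuffle(self.body(low_res))

model = TinySR()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in data: random tensors in place of real low/high-resolution render pairs.
low_res = torch.rand(8, 3, 270, 480)   # e.g. 480x270 crops of low-res renders
high_res = torch.rand(8, 3, 540, 960)  # matching 960x540 crops of native renders

for step in range(100):
    optimizer.zero_grad()
    upscaled = model(low_res)
    loss = F.l1_loss(upscaled, high_res)  # penalize per-pixel difference from native
    loss.backward()
    optimizer.step()
```

At inference time, a trained network like this is what replaces the "traditional upscaling" contrasted above: bicubic filtering can only redistribute the pixels it is given, while the trained weights carry patterns learned from the high-resolution targets.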

  • Another buzzword please.
  • I have a pretty powerful video card, an RTX 2080 Ti, but my CPU is an old i5 overclocked to about 3.9 GHz and it's a definite bottleneck on a lot of games. Without DLSS, Cyberpunk 2077 was very hard to get running smoothly.

    DLSS does not reduce CPU workload.

  • The same number of DX12 commands are being made either way, DLSS or not, so why he mentions a CPU bottleneck is beyond me. It sounds to me like he's just letting his GPU stall for a majority of the frame just so the CPU can play catch-up, probably in regard to the streaming-resources aspect of Cyberpunk 2077. I'd suspect he'd get similar performance with a cheaper GPU for that reason. Though, from a rendering perspective, your CPU should ideally have minimal impact on rendering anyway, esp when using Vulka
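
To make the CPU-vs-GPU bottleneck point in the two comments above concrete, here is a toy model with invented numbers: a frame cannot finish faster than the slower of its CPU and GPU portions, and a DLSS-style upscaler only shortens the GPU side.

```python
# Toy bottleneck model (all numbers invented for illustration): frame rate is
# limited by whichever of the CPU or GPU work takes longer per frame, and an
# upscaler like DLSS only reduces the GPU portion.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 14.0         # hypothetical CPU cost per frame on an older quad-core
gpu_native_ms = 25.0  # hypothetical GPU cost per frame at native resolution
gpu_dlss_ms = 12.0    # hypothetical GPU cost per frame with upscaling enabled

print(f"native: {fps(cpu_ms, gpu_native_ms):.0f} FPS")  # GPU-bound: upscaling helps
print(f"DLSS:   {fps(cpu_ms, gpu_dlss_ms):.0f} FPS")    # now CPU-bound: capped by the CPU
```

In this toy model both comments can be right at once: DLSS does not reduce the CPU's per-frame work, but if the GPU was the longer pole, cutting its cost still raises the frame rate until the CPU becomes the cap.
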
  • Cyberpunk 2077 converted me. DLSS is a miracle

    "Whoa."
