
Nvidia's RTX 3090 Demo Emphasizes the Absurdity of 8K Gaming (venturebeat.com) 142

Jeff Grubb, writing for VentureBeat: One of the things I would like you to associate with me is a skepticism of 4K gaming. I play in 4K on my PC using a 32-inch monitor that I sit a few feet away from, and that is great. But outside of that scenario, the 2160p resolution is wasted on our feeble human eyes -- especially when it comes with a sacrifice to framerate and graphical effects. And yet, I admit that Nvidia's marketing got to me when it showed gamers playing 8K games using the new RTX 3090. The idea of gaming at such fidelity is exciting. One of the elements that makes exploring 3D worlds so enthralling is the detail -- and, well, you can get a lot of that at 4320p. But 8K gaming is still, of course, absurd. And the lengths Nvidia went to in showing it off are evidence of that.

In its RTX 3090 promotional video, Nvidia had a number of livestreamers and influencers sit down to experience gaming at 4320p. The results seemed to impress everyone involved. The participants provided a lot of gasps and exclamations. But to get that reaction, the event had those influencers sitting just feet away from an 80-inch 8K LG OLED. And it takes something that extreme to get even the minimal benefits of that resolution. Even at 80 inches, you'd have to sit within 3 feet of the panel to notice a difference in pixel density between 4K and 8K. Now, I'm not saying I don't want to play games this way. I'd love to try it. And if I had an unlimited budget, maybe I'd dedicate a room in my manor to something like this. But even then, I would know that is silly.

  • Will not be on Apple ARM systems!

    • True, but Apple doesn't care. Nobody buys a Mac for gaming.

      The people who buy this will be the gaming equivalent of audiophools/audiophiles. If you believe it sounds better then it sounds better.

      The expensive price reinforces that belief.

      • by fazig ( 2909523 )
        High fidelity 3D art also requires quite a bit of graphical processing power and video memory.
        And I suspect an RTX 3090 will still be more affordable than a workstation graphics card with all that VRAM. It should be able to run high-end texturing software like Mari decently, which previously required workstation graphics to not lag all the time.

        Though of course it probably won't be as performant when it comes to double precision; but since most 3D art is done with single precision or even half precision, that shouldn't matter much.
    • Apple added support for external GPUs a while back, so it's possible these GPUs could be put in an enclosure and used on an ARM-Mac, but the chances of games being ported to take advantage of that setup is pretty low.
    • NV sells ARM boards with workstation GPUs attached for edge computing and automotive work. It is technically possible to have working Vulkan/Metal acceleration on an ARM Mac. Apple's weird driver infrastructure is probably the only barrier.

  • by Chronus1326 ( 1769658 ) on Tuesday September 01, 2020 @01:57PM (#60462450)
    It's for the new Nvidia Holodeck, not 8k gaming.
  • VR? (Score:5, Insightful)

    by nealric ( 3647765 ) on Tuesday September 01, 2020 @01:58PM (#60462452)

    Isn't the justification for absurdly high resolutions like 8K VR headsets? Even at 4K, VR doesn't reach a "retina" level of detail because the screens are just centimeters from your eyes.

    • by kaybee ( 101750 )

      Yes, exactly. I just ordered an 8K VR headset for the new Flight Simulator because I know 4K (2K per eye) isn't super great (I own an Oculus Quest).

      • That sounds frankly awesome.
        • by kaybee ( 101750 )

          I hope so. On my RTX 2080 Super I'm only getting 30-40fps on my single 4K monitor. So I'm hoping the RTX 3090 can really produce triple the performance so I can get up to around 60FPS at 8K, as they talked about in the launch video.

        • Sounds frankly foolish. Only one VR set has been certified for MS FS 2020, and the other leading VR sets have significantly varying hardware specs.

        Buy a VR headset because you have money to piss away, but if you're going to put up significant amounts of money, be sure you have use cases that make it worth the investment.

      • Yes, exactly. I just ordered an 8K VR headset

        There are no 8k HMDs available for sale. There are vendors advertising 8K HMDs who apparently can't be bothered with basic math (e.g. 3840x2160x2 != 7680x4320)

        3840x2160x2 = 16588800
        7680x4320 = 33177600
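
        A quick Python sanity check of that arithmetic (a sketch; the names are just the usual consumer resolution labels):

          # Two 4K panels (one per eye) vs. a single true 8K panel.
          px_4k = 3840 * 2160                               # 8,294,400 pixels
          px_8k = 7680 * 4320                               # 33,177,600 pixels
          print(f"4K per eye, both eyes: {2 * px_4k:,}")    # 16,588,800
          print(f"true 8K:               {px_8k:,}")        # 33,177,600
          print(f"ratio: {px_8k / (2 * px_4k):.1f}x")       # 2.0x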

        • by kaybee ( 101750 )

          Sorry, you are correct. I ordered the HP Reverb G2, which is about 9 million pixels. I have just learned that it's only a tad more on pixel count than a 4K display. So that's great news, really. It is 2.5x more pixels than the Oculus Quest, which I have and enjoy but am bothered by the pixelation. I hope that will help. But at "only" about 9 million pixels it won't be much tougher than a 4K display for the RTX 3090 playing Flight Simulator.

    • by tlhIngan ( 30335 )

      Isn't the justification for absurdly high resolutions like 8K VR headsets? Even at 4K, VR doesn't reach a "retina" level of detail because the screens are just centimeters from your eyes.

      Most likely, because you need both resolution AND framerate.

      People aren't going to use this to play games at 8k60. They're for playing it at 4k120 smoothly (because VR is less tolerant of framerate drops) in order to be realistic and not make people nauseous.

    • by AmiMoJo ( 196126 )

      Even on desktop 4k isn't enough. With a 27" 4k monitor at arm's length I can see the pixels, and my eyesight isn't exceptional.

      It's only about 160 DPI. 8k might be overkill but it will be visually perfect.

      • Wait until you get close to 50. At this age, there's not much difference between even 1080p and a classical Game Boy display, apart from the fact that the GB looks a bit greener.

        • I'm close to 50 and the difference between 1080p and 4k is night and day to me.
          • Well of course. I mean you watch the news during the day on your 1080p kitchen TV, and movies during the night on your 4K living room TV.

          • by Cederic ( 9623 )

            I'm very much finding that, as technology finally reaches the level where high-framerate 4k is viable, I can't tell the bloody difference anyway.

            Although I game at 2560x1440, so I'm already doing rather better than 1080p.

            Still, if the card can do 8k at a reasonable framerate then it should finally deliver the 4k framerates that would justify an upgrade. Now I just need to get a job so that I can afford it.

        • Re:VR? (Score:5, Interesting)

          by Miamicanes ( 730264 ) on Tuesday September 01, 2020 @04:25PM (#60463062)

          > At this age, there's not much difference between even 1080p and a classical Game Boy display,

          BULLSHIT

          I'm close to 50, and upgrading from a 24" 1920x1080 monitor to a 28" 3840x2160 monitor was UNQUESTIONABLY the best $350 I've ever spent. It's a night and day difference, especially for anyone who uses an IDE like IntelliJ or Android Studio and prefers tiny (7-9 point) fonts.

          Ditto, for my phone, which has 2560x1440 on a 6" screen. At one point, I contemplated buying a Pixel 3a XL, until I saw it side by side with my old Nexus 6P & felt like my eyes were going to bleed. Sure, maybe, if you use HUGE fonts, the difference might not be as big... but if you're like me, and use tiny fonts to pack more info onto the screen at once, higher resolution is ESSENTIAL.

          I'd go so far as to say that the difference in legibility between 1920x1080 and 3840x2160 is comparable to the difference between having or not having 0.25 to 0.50 diopters of uncorrected astigmatism. At 3840x2160, the interiors of letters like "e" and "B" (when black text is displayed on white) are crisply-defined and high-contrast. At 1920x1080, they're reduced to gray haze.

          TL/DR: as you get older, having a high-quality high-resolution display is MORE important, not less.

          • Yep. I think 4K is of dubious benefit to games, and I can only _barely_ make out a difference sitting 5-6 feet from my 65" 4K TV. But, it creates a definite benefit for anyone who stares at a screen of text for hours a day. It's not like things become readable where they weren't before, more that it just doesn't strain your eyes over time quite as much.
          • by keltor ( 99721 ) *
            Humorously, smaller pixels actually DO make it easier to look at the display. (And it will take something like 32k 24" desktop monitors before there's actually no benefit to increasing DPI.)
          • At this age, there's not much difference between even 1080p and a classical Game Boy display

            BULLSHIT

            Well, duh. Why do you think I picked a freakin' classical Game Boy display for comparison?!

          • It doesn't even need to be tiny text, nor do you need to be old. Text is just so much better at 4K at all sizes, it's hard to go back after using it for a day.

      • Absolutely, and that is why I am excited about this. 5k seems to be the sweet spot, which is why Apple started making 5k monitors in the first place, but just about nobody else is making monitors higher res than 5k. Luckily TVs are going to push 8k just because they need a higher number to advertise, so we will finally have options for very high ppi monitors just by buying 8k TVs.
      • by q_e_t ( 5104099 )
        My wife has better than 20/20 vision (it caused hilarity at the optician's recently when, waiting for a friend, she read the entire initial screening chart from several feet behind her) and she can't see the individual pixels. She can see the flicker on 60Hz LED displays very easily, though.
    • No, the justification is that Jensen wants a new leather jacket.

      It's weird that this is the only article we got on Ampere. Yeah this is basically the ridiculous top-end card, but the actual big news is that the 3070 and 3080 seem to provide huge performance increase both in rasterization and especially RT stuff at the same price point as the last gen. The early benchmarks suggest 60-80% for raster performance and up to 100% in pure raytracing tasks. https://www.youtube.com/watch?... [youtube.com]

      • by Cederic ( 9623 )

        That's nice to hear. Makes me feel superior for intentionally skipping the current generation :)

    Not quite. The highest-resolution headset on the horizon, not even on the market yet (the HP Reverb G2), has a 2160x2160 display per eye. That is almost 4K resolution in total. There's no reason at all to think that a yet-to-be-released display with double the resolution of today's high-resolution headsets will be one-upped within a single generational release of graphics cards. Maybe 8K will be relevant for the NVIDIA 4090 whenever that gets on a roadmap, but right now VR isn't the killer app.

      Where you do see advantages...

  • Possible advantage (Score:5, Insightful)

    by Tontoman ( 737489 ) * on Tuesday September 01, 2020 @01:59PM (#60462458)
    If it were "Where's Waldo?" kind of game, someone would have a potentially much larger map to search through.
  • At this rate I will never get up to date. I only just recently got a machine that can do decent 1080p gaming.
    Although 8K does sound sexy.
    • Just a few months ago I got a 1920x1080 monitor. Previously I was on 1680x1050, which served me quite well for quite a few years and never bothered me. All the newer monitor does is make things wider, which is helpful in my MMO for keeping things off to the sides. I really only got it because I wanted a second input in case I wanted to plug in my work laptop.

      4K is ok for work, I have some coworkers who swear by it. But for gaming you'd need a separate A/C unit to keep everything from melting.

  • by UnknowingFool ( 672806 ) on Tuesday September 01, 2020 @02:01PM (#60462468)
    Pffft. You haven’t lived until you’ve played Minecraft at 8K with ray tracing.
  • It's the same old expansionist business model that has been getting us into trouble for decades.

    Growth for Growth's sake. Whether we need more of whatever is growing or not. No room for solid businesses that fill a need for just as long as they are needed, and gracefully unroll when the need declines. No. Grow and expand until you crash and burn. Nobody will invest unless you have a plan for world dominance.

    • Actually, I think the RTX 30xx series is a great thing, because it's a more powerful card than your old one if you stay at your usual 1080/1280/whatever (under 4K/8K). To me, a constant frame rate and ultrawide are more important than any resolution above 1080p.

      • by Cederic ( 9623 )

        ultrawide [is] more important than any resolution above 1080p.

        So higher resolution is more important to you than higher resolution?

        • Ultrawide is an aspect ratio, not a resolution. In fact I recently upgraded to a 2560x1080 ultrawide so my GPU would have fewer pixels to compute and be able to give me higher framerates.

          • by Cederic ( 9623 )

            1080p means 1920x1080.

            Ultrawide has more pixels. It's a different screen resolution. For example, 2560 * 1080 does not equal 1920 * 1080. You don't even have the same pixel count.

            If you'd gone to 2700x768 then you'd have changed aspect ratio. I'd still state that you'd changed resolution, as the screen resolution is different, even though the pixel count is the same.

            So you do care about screen resolution, and indeed you've adopted one that's greater than 1080p. Maybe only in one direction, but it still counts.

  • Super sampling (Score:5, Interesting)

    by Anonymous Coward on Tuesday September 01, 2020 @02:07PM (#60462500)

    Another good use is super sampling to get far better anti-aliasing effects, for those of us bothered by such things.

    My video card renders a scene at 4k, then samples that down to 1080 for display.
    This lets you take advantage of hardware accelerated anti aliasing at full frame rate.

    Software anti-aliasing is a trade-off between how well it is done and how much processing it takes to do it, usually resulting in a lower frame rate.
    (Sometimes even other lag, if a game's software AA is particularly crappy.)

    An 8k-capable video card could do this with a 4k monitor, without the expense of an actual 8k monitor. Just as I cheaped out by not getting a 4k monitor.

    The improvement going from 8k down to 4k may well be smaller than going from 4k down to 1080p, though. I'm sure the early adopters will answer that with sample images soon enough.
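
    To make the idea concrete, here's a minimal Python/NumPy sketch of supersampling as a plain box-filter downsample. (Real drivers use better sample patterns and filters; the random array below just stands in for a rendered frame.)

        import numpy as np

        def ssaa_downsample(frame, factor=2):
            # Average each factor x factor block of pixels; `frame` was
            # rendered at `factor` times the display resolution per axis.
            h, w, c = frame.shape
            assert h % factor == 0 and w % factor == 0
            blocks = frame.reshape(h // factor, factor, w // factor, factor, c)
            return blocks.mean(axis=(1, 3))

        rendered = np.random.rand(2160, 3840, 3)    # stand-in for a 4K render
        displayed = ssaa_downsample(rendered, 2)    # shown on a 1080p panel
        print(displayed.shape)                      # (1080, 1920, 3)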

    • Just out of curiosity, if your video card is rendering the scene at 4K before downsampling, does that mean that you could get the same frame rate outputting 4K with AA turned off? Also, on a 4K display is AA even necessary?
    • Comment removed based on user account deletion
      • DLSS is leveraging AI to only super sample areas that need it without having to render full 8k and downsize.

        DLSS is upscaling. So it would render at 1440p and then upscale it to 4K. It is just a trick to make faster rendering possible since most people can't tell real 4K from upscaled 1440p.
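
        For contrast with the supersampling sketch above, the upscaling data flow looks like this. To be clear, DLSS itself runs a trained neural network fed with motion vectors; the naive nearest-neighbour resize here only illustrates the direction of the resize, not Nvidia's method.

          import numpy as np

          def naive_upscale(frame, out_h, out_w):
              # Nearest-neighbour resize: map each output pixel back to its
              # nearest source pixel. DLSS replaces this step with a trained
              # network; only the input/output shapes are the point here.
              h, w, _ = frame.shape
              ys = np.arange(out_h) * h // out_h
              xs = np.arange(out_w) * w // out_w
              return frame[ys][:, xs]

          internal = np.random.rand(1440, 2560, 3)       # 1440p internal render
          shown = naive_upscale(internal, 2160, 3840)    # displayed as "4K"
          print(shown.shape)                             # (2160, 3840, 3)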

  • With how quick this was published after the livestream, I can't help but think they had it ready to go. Makes me wonder if they had another article that bashed Nvidia for getting complacent if they didn't mention 8k.

    Would have been hilarious if they fucked up and published the wrong article.

    • With how quick this was published after the livestream, I can't help but think they had it ready to go. Makes me wonder if they had another article that bashed Nvidia for getting complacent if they didn't mention 8k.

      Would have been hilarious if they fucked up and published the wrong article.

      They only wrote one article. nVidia didn't pay for two.

  • Until I can play Crysis at 8k resolution, it's vapor. :)

    • Well I've got Good News [rockpapershotgun.com] for you then!

      Crysis Remastered will have 8k resolution textures, which is more pixels than my poor eyes can even handle. It will also have ray traced reflections for surfaces and water. Crytek don't appear to have released any PC system requirements yet, but I imagine the old "can it run Crysis?" joke will still stand in 2020.

  • Maybe for monitors it's a bit overkill, but that sounds great for VR, which is still in need of better resolution.

    If you increase the field of view to anything like actual human vision rather than what they do now, 8K isn't that many pixels.

    I find headsets like the Quest a bit limiting, for instance: it takes time to get used to the fact that you just don't see somebody standing to the side of you whom you would really see in real life, and staying aware of what's going on around you can take a quite unnatural amount of head movement.

    • by cirby ( 2599 )

      According to current wisdom, when we get to 16K, we'll be at about "real" human eye resolution.

      That's just a few years off, at the current rate of increase.

      • Well, 8K multiplied by two eyes equals 16K, so we're already there!

        • by vadim_t ( 324782 )

          No, the "K" refer to the number of lines in the display. If you double the horizontal resolution you also have to double the vertical one.

          8K proper is 2x2 4K displays, so 4 times the number of pixels. 16K then is 2x2 8K displays, meaning 16 times more pixels than a 4K screen today.

          So if your current system renders Witcher 3 at 4K at 60 FPS just barely, then at 16K the proportional speed would be about 3.75 FPS. The amount of horsepower 16K requires is brutal.
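
          The proportional arithmetic as a quick sketch (assuming rendering cost scales linearly with pixel count, which is only roughly true in practice):

            base_px = 3840 * 2160    # 4K pixel count
            for name, w, h in [("4K", 3840, 2160), ("8K", 7680, 4320), ("16K", 15360, 8640)]:
                scale = (w * h) / base_px
                print(f"{name}: {scale:4.0f}x the pixels of 4K -> ~{60 / scale:.2f} fps")
            # 4K: 1x -> 60.00 fps; 8K: 4x -> 15.00 fps; 16K: 16x -> 3.75 fps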

      • by ceoyoyo ( 59147 )

        Only if you're insanely wasteful (or talking about monitors). Your eyes have a small central high acuity region and the rest is crappy resolution. Even if you did build a VR display that was uniformly high resolution everywhere, because grids are easiest, it would be silly to render the whole FOV that way.

  • by klipclop ( 6724090 ) on Tuesday September 01, 2020 @02:18PM (#60462550)
    Having performance and bandwidth increases is not absurd, it's progress. Faster technology will allow innovation to trickle down into other tech fields. What is absurd is saying it's absurd.
  • by kaybee ( 101750 ) on Tuesday September 01, 2020 @02:19PM (#60462556) Homepage

    This is correct; 8K is overkill for a single monitor. But as somebody who is just getting into the new Flight Simulator I can tell you there are other scenarios:

    1) Three 2K monitors in a wrap-around configuration for simulators is 6K of resolution.

    2) The upcoming HP VR headset is 4K per eye, or 8K total. As an Oculus Quest owner I'll tell you that 2K per eye isn't all that clear.

    But for a single TV sure. I have a 4K OLED TV (55") for my gaming monitor and sit about 3 feet away and it's plenty big and 4K is plenty detailed.

    • 1) Three 2K monitors in a wrap-around configuration for simulators is 6K of resolution.

      In the horizontal - but not the vertical. Total pixels would be 2560x1440x3 = ~11 million. An 8k display = ~33.2 million. So not quite the same. However, I do agree that the selling point for such a video card is with multiple displays that wrap around a user.

    • 2) The upcoming HP VR headset is 4K per eye, or 8K total

      4k is 8.2m pixels
      8k is 33m pixels

      HP reverb is 4.6m pixels per eye for a total of 9.3m pixels across both eyes.

      • by kaybee ( 101750 )

        Hmm, you are correct, I'm not sure why I didn't know that. So the new HP Reverb G2 is pretty close to a 4K monitor. That's good news, I hope it will perform really well with the RTX 3090! :)

    • The upcoming HP VR headset is 4K per eye

      Not really. Each eye's panel is square: it has the vertical resolution of 4K (2160px) per eye, but not the horizontal resolution. In total it is 4320x2160, which is only a tad over what is colloquially referred to as "4K" (3840x2160).

      Still impressive, but it's not an 8K headset.

  • Not for anything it can do but for its price... If AMD launches a card that doesn't match this power but is more reasonable in both price/performance and power usage, then perhaps they can pull a Ryzening on Nvidia as well, and that is overdue.

    • by EvilSS ( 557649 )
      AMD won't be competing with the 3090 though. What AMD needs to beat on price/performance are the 3070 and 3080, and that's going to be a tall order.
  • It may be pointless visually, but being able to run at 120FPS @ 8K with your gamer friends is impressive... I'll still figure you are stupid, but the script kiddies who aspire to be professional e-sports players will love it... To each their own...

    • > being able to run at 120FPS @ 8K with your gamer friends is impressive... I'll figure you are stupid, but the script kiddies who aspire to be professional E-Sports players will love it

      You DO realize that most professional CS:GO players run at 1024x768 [csgopedia.com] or 1280x960, right? After those come 1920x1080 and 1280x1024; some even use 1680x1050 [prosettings.net].

  • Every time I'm on a business trip descending into a city I look out the window and think, "how could a flight sim ever capture that almost-infinite detail?"

    I dunno. I have no experience with 8k. But I plugged a laptop supporting only 1080p into my 4k desktop monitor and I was surprised how crappy it looks, so the author's skepticism even of 4k is suspicious to me.

  • I'd rather burn GPU cycles on gobs of frame rate with minimal judder, tearing, and distance blurring, along with pristine HDR, over 8K resolution any time. Crap, I'll take even more of the same at 1080p.

  • by Miamicanes ( 730264 ) on Tuesday September 01, 2020 @02:40PM (#60462642)

    The same GPU power that enables 7680x4320@60fps can enable 3840x2160@120-240fps with comparable quality settings.

    The same GPU power that enables 3840x2160@240fps with high quality settings might enable 1920x1080@120fps with realtime raytracing instead of matrix-math surface-mapping tricks.

    And in any case, "8k" lies at the LOWER end of what a GPU driving a VR HMD really needs to be capable of.

    In theory, the eye's "resolution" is commonly stated to be "1 arc second", which works out to 60 pixels per degree FOV. 60-degree FOV is pretty much the absolute lower end of remotely-tolerable tunnel vision. 90-degree FOV is regarded as the minimum anyone really wants to use. 120-degree FOV is approximately the point where the limited FOV is something you're constantly aware of.

    For 60-degree FOV, 1 arc-second resolution works out to 3600 pixels. For 90, 120, and 150-degree FOV, it comes out to 5400, 7200, and 9000 pixels. Per eye.
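
    (Taking the intended figure of one arcminute per pixel -- see the correction below -- those per-eye widths are easy to reproduce:)

        PPD = 60    # pixels per degree at ~1 arcminute per pixel
        for fov in (60, 90, 120, 150):
            print(f"{fov:3d}-degree FOV -> {fov * PPD:4d} px across, per eye")
        # 60 -> 3600, 90 -> 5400, 120 -> 7200, 150 -> 9000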

    But wait, it gets worse.

    The eye's "resolution" is only 1 arc second if you're considering what it can instantaneously resolve. The thing is, your eyes also constantly move around, building up the scene in a manner not unlike synthetic aperture radar. So... to literally eliminate every perceptible hint of a non-antialiased pixel, you have to go even higher. A lot higher. Like, "order of magnitude" higher. Obviously, that isn't happening anytime soon, so we're going to be stuck with things like antialiasing for a long time.

    It's also misleading to talk about the eye's "resolution" in its periphery. Yes, indeed, something seen entirely with peripheral vision is comparatively low-resolution and monochromatic... but unless you're rendering wavefronts onto a contact lens centered on the pupil, you still need to be able to deliver full resolution wherever the eye happens to be looking at that instant. This is the basis behind foveated rendering... combining a display with resolution high enough to give you full resolution wherever the eye happens to be looking, while tracking WHERE the eye is looking to concentrate the GPU's power on that area.
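
    A toy sketch of that last idea: pick a coarser shading rate for tiles farther from the tracked gaze point. (The tile grid and thresholds here are invented for illustration, not anyone's shipping numbers.)

        import numpy as np

        def shading_rate_map(tiles_y, tiles_x, gaze_y, gaze_x):
            ys, xs = np.mgrid[0:tiles_y, 0:tiles_x]
            ecc = np.hypot(ys - gaze_y, xs - gaze_x)       # distance from gaze, in tiles
            rate = np.ones((tiles_y, tiles_x), dtype=int)  # 1 = shade every pixel
            rate[ecc > 4] = 2                              # one shade per 2x2 block
            rate[ecc > 8] = 4                              # one shade per 4x4 block
            return rate

        print(shading_rate_map(9, 16, gaze_y=4, gaze_x=8))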

    There's a big 'gotcha' with foveation, though... your peripheral vision might be low-res, but it's ALSO extraordinarily sensitive to motion, and your brain subconsciously tracks it to look for "things that aren't quite right". Guess what? Foveated rendering sets off MAJOR subconscious alarm bells. At the very least, it throws off your situational awareness, and you end up kind of like a kid with glasses who has astigmatism & a big strength difference between his left and right eyes trying to catch a baseball... you end up metaphorically getting hit in the face by the baseball, because too many things disrupt the brain's ability to merge and track the object until it's too late.

    We have a DEPRESSINGLY long way to go before VR is going to stop feeling "unreal" after more than a few seconds. We need higher resolution displays with faster framerates, ENORMOUSLY faster rendering pipelines to reduce latency, and what I like to refer to as "fog" or "plasma" computing... basically, a cloud-like mini-mainframe sitting a few feet away that's the equivalent of a small rack of i9 motherboards with RTX GPUs. We're not going to get anywhere close to the necessary level of hardware with the equivalent of a top-shelf Android phone (at least, not without tethering it to both power cables AND a refrigerant loop or chilled-water line to keep it from giving you first-degree burns through your clothing).

    That's not to say we won't have "usable", and even "fun/enjoyable", VR before we get to that point... for things like passively watching a 3D movie in a virtual private movie theater, even present hardware is semi-OK. But even then, you'd really want to sit someplace where your head can be held immobile, to avoid breaking the illusion of immersion with lag and "slosh". And after a while, you'll start to feel like you're watching a DVD projected onto a screen with a cheap LCoS projector that suffers from major screen-door effect.

    • ^--- whoops, major brain fart... I accidentally wrote "arc seconds" where I should have written "arc MINUTES".

      Everything else, including the math, is fine... there are 60 arcminutes per degree, so the resolutions I listed are correct.

    • The same GPU power that enables 7680x4320@60fps can enable 3840x2160@120-240fps with comparable quality settings.

      This is not correct though, at least for the version of 8K that TFA is talking about. These are entirely different execution units and so they don't have any relation.

      They are getting 8K@60fps by rendering at 1440p and using an AI upscaler to get it to 8K. Actual 8K perf (i.e. the thing that might scale to better FPS at lower resolutions) is still impractically low for the games tested -- something like 20fps in some games.

      The same GPU power that enables 3840x2160@240fps with high quality settings might enable 1920x1080@120fps with realtime raytracing instead of matrix-math surface-mapping tricks.

      Raytracing also uses different execution units, so you can't assume any relation between the two.

  • I just built a gaming rig for a friend and he insisted on a 4K monitor. Sitting about 3 feet from my 1080p and his new 4K monitor really yields nothing special to my eyes (I'm 52). I can't imagine 8K doing anything even remotely special. 3D hologram like screens or 3D that works without glasses, now that would be a show stopper to me for gaming.
    • Depending on what the game is, it may not make a big difference. One of the places I've seen 4K shine is with RTS-type games where you can zoom way out from the map but not lose detail. With FPS-type games, the difference isn't as noticeable.

      The other thing to do is pull up some text (webpage, whatever) on both screens and compare. The 4K monitor is able to render text so much clearer and cleaner - it's well worth it just for that.

  • While it probably doesn't make too much sense for a game which attempts to re-create some sort of 3D virtual camera image, there is potential for other types of games. For example, a game like Sim City could benefit from being able to show the whole grid at once. You wouldn't need to scroll anymore.

    Of course at a distance of half a meter an 8k screen would need to be roughly 80 inches and probably curved.

  • VR... even dual 4k is not enough!

  • by rsilvergun ( 571051 ) on Tuesday September 01, 2020 @05:39PM (#60463284)
    with 8GB RAM standard. It's crazy that my $200 RX 580 has 8GB (and I bought it used for $100) while the 5600XT has only 6GB. Hell, there are 8GB 570s still selling for around $130. And don't get me started on the 1660 line from Nvidia.

    Stop gimping your cards so they don't compete with your top end.
  • Even at 80 inches, you'd have to sit within 3 feet of the panel to notice a difference in pixel density between 4K and 8K.

    This is a stupid argument. Well, duh, just get a 160-inch TV then. People always say stuff like this when the tech is new. When 120-inch TVs are everywhere, 8K gaming will indeed become quite useful.
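
    The geometry behind the quoted claim is easy to reproduce (a sketch assuming 16:9 panels and the usual ~1 arcminute acuity figure):

        import math

        def pixel_visible_within_ft(diag_in, horiz_px):
            # Farthest distance (in feet) at which one pixel of a 16:9 panel
            # still subtends >= 1 arcminute, i.e. can just be resolved.
            width_in = diag_in * 16 / math.hypot(16, 9)
            pitch_in = width_in / horiz_px
            return pitch_in / math.tan(math.radians(1 / 60)) / 12

        for diag in (80, 120, 160):
            print(f'{diag}": 4K pixels resolvable within ~{pixel_visible_within_ft(diag, 3840):.1f} ft, '
                  f'8K within ~{pixel_visible_within_ft(diag, 7680):.1f} ft')
        # 80" panel: ~5.2 ft and ~2.6 ft -- so 8K only helps within a few feet,
        # in the same ballpark as the article's "3 feet" figure. Doubling the
        # panel to 160" doubles both distances, which is the commenter's point.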

  • 1) I play on a 35-inch ultrawide QHD screen, on PC. When I downgrade it to 1080p everything's pretty damn ugly. The display is a Sceptre 35; I bought it for 350USD, and it is a 100Hz, G-Sync panel. Great for gaming and not that expensive. I can only assume this category is exploding right now.

    2) If a graphics card is fast enough for 8k gaming, it is even faster for 4k gaming. The best gaming setups right now run things like MS Flight Simulator 2020 at less than 60fps with all details maxed... even at 1080p. Until...

"It's the best thing since professional golfers on 'ludes." -- Rick Obidiah

Working...