Nvidia's RTX 3090 Demo Emphasizes the Absurdity of 8K Gaming (venturebeat.com)
Jeff Grubb, writing for VentureBeat: One of the things I would like you to associate with me is a skepticism of 4K gaming. I play in 4K on my PC using a 32-inch monitor that I sit a few feet away from, and that is great. But outside of that scenario, the 2160p resolution is wasted on our feeble human eyes -- especially when it comes with a sacrifice to framerate and graphical effects. And yet, I admit that Nvidia's marketing got to me when it showed gamers playing 8K games using the new RTX 3090. The idea of gaming at such fidelity is exciting. One of the elements that makes exploring 3D worlds so enthralling is the details, and -- well, you can get a lot of that at 4320p. But 8K gaming is still, of course, absurd. And the lengths Nvidia had to go to in order to show it off are evidence of that.
In its RTX 3090 promotional video, Nvidia had a number of livestreamers and influencers sit down to experience gaming at 4320p. The results seemed to impress everyone involved. The participants provided a lot of gasps and exclamations. But to get that reaction, the event had those influencers sitting just feet away from an 80-inch 8K LG OLED. And it takes something that extreme to get even the minimal benefits of that resolution. Even at 80 inches, you'd have to sit within 3 feet of the panel to notice a difference in pixel density between 4K and 8K. Now, I'm not saying I don't want to play games this way. I'd love to try it. And if I had an unlimited budget, maybe I'd dedicate a room in my manor to something like this. But even then, I would know that is silly.
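As a rough sanity check on that viewing-distance claim, here is a back-of-the-envelope calculation (a sketch, assuming the common ~60 pixels-per-degree figure for 20/20 acuity and a 16:9 panel; the true threshold varies by viewer):

```python
import math

def resolvable_within_ft(diag_in, h_px, v_px, px_per_deg=60):
    """Farthest distance (feet) at which individual pixels can still be
    resolved, assuming ~60 px/degree (20/20) visual acuity."""
    pitch_in = diag_in / math.hypot(h_px, v_px)   # physical pixel pitch
    theta = math.radians(1 / px_per_deg)          # angle one pixel must subtend
    return pitch_in / theta / 12                  # small-angle approx, in feet

for name, (h, v) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
    print(f"80-inch {name}: pixels resolvable within ~{resolvable_within_ft(80, h, v):.1f} ft")
```

That prints roughly 5.2 ft for 4K and 2.6 ft for 8K: on an 80-inch panel you would need to be within about 3 feet before 8K's finer pixel grid is even theoretically resolvable, which matches the figure above.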
will not be on apple arm systems! (Score:2)
will not be on apple arm systems!
Re: (Score:3)
True, but Apple doesn't care. Nobody buys a Mac for gaming.
The people who buy this will be the gaming equivalent of audiophools/audiophiles. If you believe it sounds better then it sounds better.
The high price reinforces that belief.
Re: (Score:3)
And even then I suspect an RTX 3090 will still be more affordable than a workstation graphics card with all that VRAM. It should be able to run high-end texturing software like Mari decently, which previously required workstation graphics to not lag all the time.
Though of course it probably won't be as performant when it comes to double precision, but since most 3D art is done with single precision or even half precision
Re: (Score:2)
8K means not having to fake higher resolution with stop-gap garbage like antialiasing.
That's what they said about 4K.
I game at 4K on a 50 inch display and I can see the individual pixels. I would welcome 8K, but my aging 2080 isn't up to it.
8K won't fix that. What you really need is a monitor with better ratio of dots to black-space-around-the-dots.
(and/or sit a bit further away).
Re: will not be on apple arm systems! (Score:2)
NV sells ARM boards with workstation GPUs attached for edge computing and automotive work. It is technically possible to have working Vulkan/Metal acceleration on an ARM Mac. Apple's weird driver infrastructure is probably the only barrier.
It's for the new (Score:3, Funny)
VR? (Score:5, Insightful)
Isn't the justification for absurdly high resolutions like 8k VR headsets? Even at 4K, VR is not a "retina" display level of detail because the screens are just centimeters from your eyes.
Re: (Score:3)
Yes, exactly. I just ordered an 8K VR headset for the new Flight Simulator because I know 4K (2K per eye) isn't super great (I own an Oculus Quest).
Re: (Score:2)
I hope so. On my RTX 2080 Super I'm only getting 30-40fps on my single 4K monitor. So I'm hoping the RTX 3090 can really produce triple the performance so I can get up to around 60FPS at 8K, as they talked about in the launch video.
Re: (Score:2)
Sounds frankly foolish. Only one VR set has been certified for MS FS 2020, and the other leading VR sets have significantly varying hardware specs.
Buy a VR headset because you have money to piss away, but if you're going to put up significant amounts of money, be sure you have use cases that make it worth the investment.
Re: (Score:3)
Yes, exactly. I just ordered an 8K VR headset
There are no 8k HMDs available for sale. There are vendors advertising 8K HMDs who apparently can't be bothered with basic math (e.g. 3840x2160x2 != 7680x4320)
3840x2160x2 = 16588800
7680x4320 = 33177600
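Spelling that arithmetic out (nothing assumed here beyond the mode names already quoted):

```python
modes = {
    "4K (3840x2160)":          3840 * 2160,       #  8,294,400
    "dual 4K (2x 3840x2160)":  3840 * 2160 * 2,   # 16,588,800
    "8K (7680x4320)":          7680 * 4320,       # 33,177,600
}
for name, px in modes.items():
    print(f"{name}: {px:,} pixels")
# Two 4K panels are only half of true 8K, hence 3840x2160x2 != 7680x4320.
```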
Re: (Score:2)
Sorry, you are correct. I ordered the HP Reverb G2, which is about 9 million pixels. I have just learned that it is just a tad more than a 4K display on the pixel count. So that's great news really. It is 2.5x more pixels than the Oculus Quest, which I have and enjoy but am bothered by the pixelation. I hope that will help. But at "only" about 9 million pixels it won't be much tougher than a 4K display for the RTX 3090 playing Flight Simulator.
Re: (Score:3)
Most likely, because you need both resolution AND framerate.
People aren't going to use this to play games at 8k60. They're for playing it at 4k120 smoothly (because VR is less tolerant of framerate drops) in order to be realistic and not make people nauseous.
Re: (Score:2)
Even on desktop 4k isn't enough. With a 27" 4k monitor at arms length I can see the pixels, and my eyesight isn't exceptional.
It's only about 160 DPI. 8k might be overkill but it will be visually perfect.
Re: (Score:3)
Wait until you get close to 50. At this age, there's not much difference between even 1080p and a classical Game Boy display, apart from the fact that the GB looks a bit greener.
Re: (Score:2)
Well of course. I mean you watch the news during the day on your 1080p kitchen TV, and movies during the night on your 4K living room TV.
Re: (Score:2)
I'm very much finding that, as technology finally reaches the level where high-framerate 4k resolutions are viable, I can't tell the bloody difference anyway.
Although I game at 2560x1440, so I'm already doing rather better than 1080p.
Still, if the card can do 8k at a reasonable framerate then it should finally deliver the 4k framerates that would justify an upgrade. Now I just need to get a job so that I can afford it.
Re:VR? (Score:5, Interesting)
> At this age, there's not much difference between even 1080p and a classical Game Boy display,
BULLSHIT
I'm close to 50, and upgrading from a 24" 1920x1080 monitor to a 28" 3840x2160 monitor was UNQUESTIONABLY the best $350 I've ever spent. It's a night and day difference, especially for anyone who uses an IDE like IntelliJ or Android Studio and prefers tiny (7-9 point) fonts.
Ditto, for my phone, which has 2560x1440 on a 6" screen. At one point, I contemplated buying a Pixel 3a XL, until I saw it side by side with my old Nexus 6P & felt like my eyes were going to bleed. Sure, maybe, if you use HUGE fonts, the difference might not be as big... but if you're like me, and use tiny fonts to pack more info onto the screen at once, higher resolution is ESSENTIAL.
I'd go so far as to say that the difference in legibility between 1920x1080 and 3840x2160 is comparable to the difference between having or not having 0.25 to 0.50 diopters of uncorrected astigmatism. At 3840x2160, the interiors of letters like "e" and "B" (when black text is displayed on white) are crisply-defined and high-contrast. At 1920x1080, they're reduced to gray haze.
TL/DR: as you get older, having a high-quality high-resolution display is MORE important, not less.
Re: (Score:2)
Well, duh. Why do you think I picked a freakin' classical Game Boy display for comparison?!
Re: (Score:2)
It doesn't even need to be tiny text, nor do you need to be old. Text is just so much better at 4K at all sizes, it's hard to go back after using it for a day.
Re: (Score:2)
Not that a 3090 would necessarily even handle 4k 140hz gaming at full quality.
Depends on the game, I suspect. It has over twice the shader TFLOPS of a 2080Ti, and my 2080Ti pulls between the mid 50s and mid 90s in 4K depending on the game.
Re: (Score:3)
5k at 27" is exactly 2x the standard 2560x1440 panel in each dimension, so Apple could use perfect pixel doubling.
Anyway, yeah we need 8k monitors, hopefully soon.
Isn't that part of the rationale for 8k? (Score:2)
Re: (Score:2)
Re: (Score:2)
No, the justification is that Jensen wants a new leather jacket.
It's weird that this is the only article we got on Ampere. Yeah, this is basically the ridiculous top-end card, but the actual big news is that the 3070 and 3080 seem to provide huge performance increases in both rasterization and especially RT workloads at the same price points as the last gen. Early benchmarks suggest 60-80% gains in raster performance and up to 100% in pure raytracing tasks. https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
That's nice to hear. Makes me feel superior for intentionally skipping the current generation :)
Re: (Score:2)
Not quite. The highest-resolution headset coming soon, not even on the market yet (the HP Reverb G2), has a 2160x2160 display per eye. Across both eyes, that is almost the pixel count of a single 4K display. There's no reason at all to think that this yet-to-be-released display, with double the resolution of current high-resolution displays, will be one-upped within a generational release of graphics cards. Maybe 8K will be relevant for the NVIDIA 4090 whenever that gets on a roadmap, but right now VR isn't the killer app.
Where you do see adve
Re: (Score:2)
Oh I stand corrected. Hadn't heard of this headset before. Cheers.
Possible advantage (Score:5, Insightful)
I will never get caught up (Score:2)
Although 8K does sound sexy.
Re: (Score:2)
Just a few months ago I got a 1920x1080 monitor. Previously I was on 1680x1050, which served me quite well for quite a few years and never bothered me. All the newer monitor does is make things wider, which is helpful in my MMO for keeping stuff just off to the sides. I really only got it because I wanted a second input in case I wanted to plug in my work laptop.
4K is ok for work, I have some coworkers who swear by it. But for gaming you'd need a separate A/C unit to keep everything from melting.
Obviously he hasn’t played the "right (Score:4, Funny)
Bigger. Faster. More. etc. (Score:2)
It's the same old expansionist business model that has been getting us into trouble for decades.
Growth for Growth's sake. Whether we need more of whatever is growing or not. No room for solid businesses that fill a need for just as long as they are needed, and gracefully unroll when the need declines. No. Grow and expand until you crash and burn. Nobody will invest unless you have a plan for world dominance.
Re: (Score:2)
Actually, I think the RTX 30xx series is a great thing, because it's a more powerful card than your old one if you stay at your usual 1080/1280/whatever (under 4K/8K). To me, a constant frame rate and ultrawide are more important than any resolution above 1080p.
Re: (Score:2)
ultrawide [is] more important than any resolution above 1080p.
So higher resolution is more important to you than higher resolution?
Re: (Score:2)
Ultrawide is an aspect ratio, not a resolution. In fact I recently upgraded to a 2560x1080 ultrawide so my GPU would have fewer pixels to compute and be able to give me higher framerates.
Re: (Score:2)
1080p means 1920x1080.
Ultrawide has more pixels. It's a different screen resolution. For example, 2560 * 1080 does not equal 1920 * 1080. You don't even have the same pixel count.
If you'd gone to 2700x768 then you'd have changed aspect ratio. I'd still state that you'd changed resolution, as the screen resolution is different, even though the pixel count is the same.
So you do care about screen resolution, and indeed you've adopted one that's greater than 1080p. Maybe only in one direction, but it still coun
Super sampling (Score:5, Interesting)
Another good use is super sampling to get far better anti-aliasing effects, for those of us bothered by such things.
My video card renders a scene at 4k, then samples that down to 1080 for display.
This lets you take advantage of hardware accelerated anti aliasing at full frame rate.
Software anti aliasing is a trade off between how well it is done and how much processing it takes to do it, usually resulting in a lower frame rate.
(Sometimes even other lag, if a games software AA is particularly crappy)
An 8k-capable video card could do this with a 4k monitor, without the expense of an actual 8k monitor, just as I cheaped out by not getting a 4k monitor.
The improvement going from 8k down to 4k could be much smaller than going from 4k down to 1080p, though. I'm sure the early adopters will answer that with sample images soon enough.
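For the curious, that downsampling step really is just "render big, average down." A minimal sketch with numpy, assuming a 2x supersample factor and a plain box filter (cruder than the filtering drivers actually ship):

```python
import numpy as np

def box_downsample(img, factor=2):
    """Average each factor x factor block of a supersampled frame
    into one output pixel (ordered-grid SSAA resolve)."""
    h, w, c = img.shape
    blocks = img.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# A frame "rendered" at 4K, resolved down to a 1080p display:
frame_4k = np.random.rand(2160, 3840, 3).astype(np.float32)
frame_1080 = box_downsample(frame_4k)
print(frame_1080.shape)  # (1080, 1920, 3)
```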
Re: (Score:2)
Re: (Score:2)
DLSS is leveraging AI to only super sample areas that need it without having to render full 8k and downsize.
DLSS is upscaling. So it would render at 1440p and then upscale it to 4K. It is just a trick to make faster rendering possible since most people can't tell real 4K from upscaled 1440p.
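To make the contrast concrete: supersampling averages away pixels it rendered, while upscaling invents pixels it never rendered. A deliberately naive nearest-neighbor stand-in for that resize (DLSS itself uses a trained neural network, so this shows only the direction of the operation, not the quality):

```python
import numpy as np

def naive_upscale(img, factor):
    """Nearest-neighbor upscale: every rendered pixel becomes a
    factor x factor block. DLSS swaps this step for a neural net."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1440p = np.random.rand(1440, 2560, 3)  # what the GPU actually renders
frame_8k = naive_upscale(frame_1440p, 3)     # what the panel receives
print(frame_8k.shape)                        # (4320, 7680, 3)
```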
With how quick (Score:2)
this was published after the livestream I can't help but think they had it ready to go. Makes me wonder if they had another article that bashed Nvidia for getting complacent if they didn't mention 8k.
Would have been hilarious if they fucked up and published the wrong article.
Re: (Score:2)
With how quick this was published after the livestream I can't help but think they had it ready to go. Makes me wonder if they had another article that bashed Nvidia for getting complacent if they didn't mention 8k.
Would have been hilarious if they fucked up and published the wrong article.
They only wrote one article. nVidia didn't pay for two.
but can it play Crysis... (Score:2)
Until I can play Crysis at 8k resolution, it's vapor. :)
Re: (Score:2)
Well I've got Good News [rockpapershotgun.com] for you then!
Crysis Remastered will have 8k resolution textures, which is more pixels than my poor eyes can even handle. It will also have ray traced reflections for surfaces and water. Crytek don't appear to have released any PC system requirements yet but I imagine the old "can it run Crysis?" joke will still stand in 2020.
8K would be nice for VR headsets (Score:2)
Maybe for monitors it's a bit overkill, but that sounds great for VR, which is still in need of better resolution.
If you increase the field of view to anything like actual human vision rather than what they do now, 8K isn't that many pixels.
I find headsets like the Quest a bit limiting for instance, it takes time to get used to that you just don't see somebody standing to the side of you that you would really see in real life, and being aware of what's going around can take a quite unnatural amount of head
Re: (Score:2)
According to current wisdom, when we get to 16K, we'll be at about "real" human eye resolution.
That's just a few years off, at the current rate of increase.
Re: (Score:2)
Well, 8K multiplied by two eyes equals 16K, so we're already there!
Re: (Score:2)
No, the "K" refer to the number of lines in the display. If you double the horizontal resolution you also have to double the vertical one.
8K proper is 2x2 4K displays, so 4 times the number of pixels. 16K then is 2x2 8K displays, meaning 16 times more pixels than a 4K screen today.
So if your current system renders Witcher 3 at 4K at 60 FPS just barely, then at 16K the proportional speed would be 3.75 FPS. The amount of horsepower 16K requires is brutal.
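The proportional arithmetic, spelled out (a naive estimate that assumes rendering cost scales linearly with pixel count; real games don't scale quite this cleanly):

```python
base_px = 3840 * 2160  # 4K
base_fps = 60          # a system that barely holds 4K at 60 FPS

for name, (w, h) in {"8K": (7680, 4320), "16K": (15360, 8640)}.items():
    scale = (w * h) / base_px
    print(f"{name}: {scale:.0f}x the pixels of 4K -> ~{base_fps / scale:.2f} FPS")
# 8K: 4x -> ~15.00 FPS;  16K: 16x -> ~3.75 FPS
```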
Re: (Score:2)
Only if you're insanely wasteful (or talking about monitors). Your eyes have a small central high acuity region and the rest is crappy resolution. Even if you did build a VR display that was uniformly high resolution everywhere, because grids are easiest, it would be silly to render the whole FOV that way.
Not absurd... (Score:3)
Correct but only in single-monitor situations (Score:5, Insightful)
This is correct; 8K is overkill for a single monitor. But as somebody who is just getting into the new Flight Simulator I can tell you there are other scenarios:
1) Three 2K monitors in a wrap-around configuration for simulators is 6K of resolution.
2) The upcoming HP VR headset is 4K per eye, or 8K total. As an Oculus Quest owner I'll tell you that 2K per eye isn't all that clear.
But for a single TV sure. I have a 4K OLED TV (55") for my gaming monitor and sit about 3 feet away and it's plenty big and 4K is plenty detailed.
Re: (Score:3)
1) Three 2K monitors in a wrap-around configuration for simulators is 6K of resolution.
In the horizontal - but not the vertical. Total pixels would be 2560x1440x3 = ~11 million. An 8k display = ~33.2 million. So not quite the same. However, I do agree that the selling point for such a video card is with multiple displays that wrap around a user.
Re: (Score:3)
2) The upcoming HP VR headset is 4K per eye, or 8K total
4k is 8.2m pixels
8k is 33m pixels
HP reverb is 4.6m pixels per eye for a total of 9.3m pixels across both eyes.
Re: (Score:2)
Hmm, you are correct, I'm not sure why I didn't know that. So the new HP Reverb G2 is pretty close to a 4K monitor. That's good news, I hope it will perform really well with the RTX 3090! :)
Re: (Score:2)
The upcoming HP VR headset is 4K per eye
Not really. Each eye is square: it has the vertical resolution of 4K (2160px) per eye, but not the horizontal resolution. In total it is 4320 x 2160, which is only a tad over what is colloquially referred to as "4K" (3840 x 2160).
Still impressive, but it's not an 8K headset.
I love the RTX3090 (Score:2)
Not for anything it can do but for its price... If AMD launches a card that doesn't match this power but is more reasonable in both price/performance and power usage, then perhaps they can pull a Ryzen on Nvidia as well, and that is overdue.
But, but... It's not bragging if it's true! (Score:2)
It may be pointless visually, but being able to run at 120FPS @ 8K with your gamer friends is impressive... I'll figure you are stupid, but the script kiddies who aspire to be professional E-Sports players will love it... To each their own...
Re: (Score:2)
> being able to run at 120FPS @ 8K with your gamer friends is impressive... I'll figure you are stupid, but the script kiddies who aspire to be professional E-Sports players will love it
You DO realize that most professional CS:GO players run at 1024x768 [csgopedia.com] or 1280x960, right? After 1920x1080 and 1280x1024, some even use 1680x1050 [prosettings.net].
Re: (Score:2)
1024x768? 1920x1080?! Ewwwww!
21:9 or nothing, baby!
Flight simulators (Score:2)
I dunno. I have no experience with 8k. But I plugged a laptop supporting only 1080p into my 4k desktop monitor and I was surprised how crappy it looks, so the author's skepticism even of 4k is suspicious to me.
FPS+HDR (Score:2)
I'd rather burn GPU cycles on gobs of frame rate with minimal judder, tearing, and distance blurring, along with pristine HDR, over 8K resolution any time. Crap, I'll take even more of the same at 1080p.
TFA misses the point (Score:5, Insightful)
The same GPU power that enables 7680x4320@60fps can enable 3840x2160@120-240fps with comparable quality settings.
The same GPU power that enables 3840x2160@240fps with high quality settings might enable 1920x1080@120fps with realtime raytracing instead of matrix-math surface-mapping tricks.
And in any case, "8k" lies at the LOWER end of what a GPU driving a VR HMD really needs to be capable of.
In theory, the eye's "resolution" is commonly stated to be "1 arc second", which works out to 60 pixels per degree FOV. 60-degree FOV is pretty much the absolute lower end of remotely-tolerable tunnel vision. 90-degree FOV is regarded as the minimum anyone really wants to use. 120-degree FOV is approximately the point where the limited FOV is something you're constantly aware of.
For 60-degree FOV, 1 arc-second resolution works out to 3600 pixels. For 90, 120, and 150-degree FOV, it comes out to 5400, 7200, and 9000 pixels. Per eye.
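A worked version of that arithmetic, using the corrected acuity figure from the follow-up comment below (one arcminute per pixel, i.e. 60 pixels per degree of FOV):

```python
PX_PER_DEGREE = 60  # ~1 pixel per arcminute of visual acuity

for fov in (60, 90, 120, 150):
    print(f"{fov}-degree FOV: {fov * PX_PER_DEGREE} horizontal pixels per eye")
# 60 -> 3600, 90 -> 5400, 120 -> 7200, 150 -> 9000
```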
But wait, it gets worse.
The eye's "resolution" is only 1 arc second if you're considering what it can instantaneously resolve. The thing is, your eyes also constantly move around, building up the scene in a manner not unlike synthetic aperture radar. So... to literally eliminate every perceptible hint of a non-antialiased pixel, you have to go even higher. A lot higher. Like, "order of magnitude" higher. Obviously, that isn't happening anytime soon, so we're going to be stuck with things like antialiasing for a long time.
It's also misleading to talk about the eye's "resolution" in its periphery. Yes, indeed, something seen entirely with peripheral vision is comparatively low-resolution and monochromatic... but unless you're rendering wavefronts onto a contact lens centered on the pupil, you still need to be able to deliver full resolution wherever the eye happens to be looking at that instant. This is the basis behind foveated rendering... combining a display with resolution high enough to give you full resolution wherever the eye happens to be looking, while tracking WHERE the eye is looking to concentrate the GPU's power on that area.
There's a big 'gotcha' with foveation, though... your peripheral vision might be low-res, but it's ALSO extraordinarily sensitive to motion, and your brain subconsciously tracks it to look for "things that aren't quite right". Guess what? Foveated rendering sets off MAJOR subconscious alarm bells. At the very least, it throws off your situational awareness, and you end up kind of like a kid with glasses who has astigmatism & a big strength difference between his left and right eyes trying to catch a baseball... you end up metaphorically getting hit in the face by the baseball, because too many things disrupt the brain's ability to merge and track the object until it's too late.
We have a DEPRESSINGLY long way to go before VR is going to stop feeling "unreal" after more than a few seconds. We need higher resolution displays with faster framerates, ENORMOUSLY faster rendering pipelines to reduce latency, and what I like to refer to as "fog" or "plasma" computing... basically, a cloud-like mini-mainframe sitting a few feet away that's the equivalent of a small rack of i9 motherboards with RTX GPUs. We're not going to get anywhere close to the necessary level of hardware with the equivalent of a top-shelf Android phone (at least, not without tethering it to both power cables AND a refrigerant loop or chilled-water line to keep it from giving you first-degree burns through your clothing).
That's not to say we won't have "usable", and even "fun/enjoyable" VR, before we get to that point... for things like passively watching a 3D movie in a virtual private movie theater, even present hardware is semi-ok. But even then, you'd really want to sit someplace where your head can be held immobile, to avoid breaking the illusion of immersion with lag and "slosh". And after a while, you'll start to feel like you're watching a DVD projected into a screen with a cheap LCoS projector that suffers from major screen
Re: (Score:2)
^--- whoops, major brain fart... I accidentally wrote "arc seconds" where I should have written "arc MINUTES".
Everything else, including the math, is fine... there are 60 arcminutes per degree, so the resolutions I listed are correct.
Re: (Score:2)
The same GPU power that enables 7680x4320@60fps can enable 3840x2160@120-240fps with comparable quality settings.
This is not correct though, at least for the version of 8K that TFA is talking about. These are entirely different execution units and so they don't have any relation.
They are getting 8K@60fps by rendering at 1440p and using an AI upscaler to get it to 8K. Actual 8K perf (i.e. the thing that might scale to better FPS at lower resolutions) is still impractically low for the games tested -- something like 20fps in some games.
The same GPU power that enables 3840x2160@240fps with high quality settings might enable 1920x1080@120fps with realtime raytracing instead of matrix-math surface-mapping tricks.
Raytracing also uses different execution units, so you can't assume any relation betw
The difference is minimal (Score:2)
Re: (Score:2)
Depending on what the game is, it may not make a big difference. One of the places I've seen 4K shine is with RTS-type games where you can zoom way out from the map but not lose detail. With FPS-type games, the difference isn't as noticeable.
The other thing to do is pull up some text (webpage, whatever) on both screens and compare. The 4K monitor is able to render text so much clearer and cleaner - it's well worth it just for that.
Re: (Score:2)
He's Gen X not a boomer you imbecile.
He also didn't comment on whether the technology would be useful for others, or should be available for others.
So you're basically wrong on everything you said. No wonder you posted anonymously.
It clearly depends on the type of game (Score:2)
While it probably doesn't make too much sense for games that attempt to re-create some sort of 3D virtual camera image, there is potential for other types of games. For example, a game like Sim City could benefit from being able to show the whole grid at once. You wouldn't need to scroll any more.
Of course at a distance of half a meter an 8k screen would need to be roughly 80 inches and probably curved.
One word (Score:2)
VR... even dual 4k is not enough!
Just give me better sub $200 cards (Score:3)
Stop gimping your cards so they don't compete with your top end.
just go larger than 80 (Score:2)
Even at 80 inches, you'd have to sit within 3 feet of the panel to notice a difference in pixel density between 4K and 8K.
This is a stupid argument. Well duh, just get a 160-inch TV then. People always say stuff like this when the tech is new. When 120-inch TVs are everywhere, 8K gaming will indeed become quite useful.
Bad argument is bad (Score:2)
1) I play on a 35 inch ultrawide QHD screen, on PC. When I downgrade it to 1080p everything's pretty damn ugly. The display is a Sceptre 35; I bought it for 350USD, and it is a 100hz, gsync panel. Great for gaming and not that expensive. I can only assume this category is exploding right now.
2) If a graphics card is fast enough for 8k gaming, it is faster for 4k gaming. The best gaming setups right now run things like MS Flight Simulator 2020 at less than 60fps with all details maxed... even at 1080p. Until
Re: (Score:2)
Bot flies are real. Brent Spiner is a person.
You don't know very much about bots.
Re: (Score:2)
I'm sitting 2-1/2 feet from a 1920x1080 monitor. It's inadequate for small text.
8k is where the "more pixels is better" philosophy is going to stop. There may be specialized applications where higher resolution is useful, but for entertainment, graphics, business, development, and general computing there is no advantage beyond 8k.
Re: (Score:2)
For a television, which GP was referring to, you either need to sit closer than the optimum viewing distance or go larger than ~70" to notice the difference between 1080p and 4K.
Re: (Score:2)
My living room TV is 60" at 1920x1080. I can easily make out pixels, even while sitting ~7 feet away.
The "optimum viewing distance" is WAY closer (and lower) than most people think it is. Most people put their TVs WAY too high. Ideally, the top of the screen should be just inside your field of vision when you're looking straight ahead and slightly downward, which for most TVs implies that if your TV is more than a foot above the floor, it's probably too high. Likewise, the TV's width should fill around 60-8
Re: (Score:2)
LOL. That's the joke. 640K was absurd. No advantage to have anything beyond that. 8K is just as absurd now as 640K was back when Bill Gates said it.
Except that Billy G. never said it. For the record, I hate M$ as much as anyone, but there is simply no evidence that he ever said it. Hate him if you must, but at least make it for the right stuff.
Re: (Score:2)
What's to hate? It's been fifteen years since the Windows OS was affected by a Bill Gates technical or marketing decision. It's been twenty years since the DOJ could make an antitrust case against Microsoft based on what it did to STAC (and Netscape).
It's a less monopolistic environment in computing now than twenty years ago. Businesses are looking to move to cloud computing, and even then, it's not like Azure is a favorite. People can even "game" on Linux, even though the most exclusive games are only available on
Re: (Score:2)
I'm betting it won't happen, because it's bumping up against human wetware limitations. It's the same reason that audio hasn't really moved beyond 16-bit/48kHz - it's literally as good as it needs to get. There's no good reason to develop TVs or monitors with resolution higher than what the majority of people can even perceive.
That being said, one area we could benefit from still higher resolution is VR, since it requires such a wide field of view only a few cm from your eyes.
Re: (Score:2)
Hell, there's no advantage (for computing) beyond 2160p monitors. If you actually could use more screen space, it'd probably be cheaper and more ergonomic to use two 4K displays.
Re: (Score:2)
PCMR means many things to different people.
To me, PCMR means a 1080 ultrawide display and keyboard+mouse controls.
For others, it's 240Hz and HDR.
Etc.