Xbox's Phil Spencer Isn't Sure 8K Will Ever Be Standard in Video Games (ign.com) 123

Xbox boss Phil Spencer has said that he isn't sure 8K resolution will ever be standard for video games, calling it "aspirational technology." From a report: Talking to Wired, Spencer said, "I think 8K is aspirational technology. The display capabilities of devices are not really there yet. I think we're years away from 8K being -- if it ever is -- standard in video games." Spencer's comments come despite the Xbox Series X being able to support 8K output. However, while it may technically be able to push video at a resolution of 7680 x 4320, there are more factors to consider, chief among them whether anyone even has an 8K television or monitor to view such visuals on. According to Wired's chat with Liz Hamren, head of gaming engineering at Xbox, Microsoft's data suggests that 4K TV adoption is lower than publishers may think, which in turn suggests 8K adoption is still years away at least.
This discussion has been archived. No new comments can be posted.


  • streaming that may kill your download cap in a few hours

    • by Computershack ( 1143409 ) on Tuesday October 06, 2020 @01:24PM (#60578256)
      Europe to America....what's a download cap?
    • What download cap and why would I be streaming games when I can play locally?

      Also Phil Spencer is short-sighted. Perhaps 8K is a long way off for consoles and their underpowered hardware, but it's not that far off for gaming PCs. I already play almost everything at 4K now. Once I upgrade to an RTX 3090, 8K should be possible.

      • by noodler ( 724788 )

        Yeah, but you're probably doing yourself a disservice by pushing for 8k.
        A GPU has only limited resources, so the trade-off becomes "more pixels" vs "better-looking pixels".
        To get the same pixel quality at 8k you'd need 4x the shader power, and the 3090 sure doesn't have 4x the shader power compared to the 20-series, so quality will take a big hit.
        I also don't think you will notice any difference in resolution between 4k and 8k in practice. It's a nice idea and all, but it's mostly bullshit unless you have a giant TV.

        • I do have a giant TV. That's the point of going 8K for me.

          • by noodler ( 724788 )

            Sure, but how far do you sit from it?
            I bet that if you have anti-aliasing enabled you won't be able to tell the difference between 4k and 8k AND you will have more GPU power left to render better pixels.
            I saw people looking at video on an 80" 4k and 8k TV -- from up close -- and they couldn't tell the difference.
            If you're putting your GPU to work on 8k you're wasting a lot of picture quality for resolution that you probably won't see.
            This whole 8k thing is mostly just a scam to sell more screens. It's just like H

    • Comment removed based on user account deletion
  • Cool? Yes. Necessary? Probably not. At 3840x2160 on a reasonably small and/or distant display, anti-aliasing has already become largely unnecessary to achieve excellent image quality. Even a doubling of pixel count would probably render it irrelevant on all but the largest displays. Past a certain point, a single dark pixel will look the same as a larger, slightly less dark pixel. Could be relevant to split-screen gaming, I suppose.
    • In a typical game rendering scenario, a conservative game will buffer 12 bytes per pixel during frame rendering, and at "8K" that's almost 400MB. One of these highly advanced deferred shading games will use more like 24 bytes per pixel.

      ...to be clear, that's just the technical rendering overhead, not the textures, not the shadow maps, not the light maps, not the environment maps, ....

      400MB * 60 FPS = 24GB/s of bandwidth consumed just on the technicalities of rendering, not counting the actual data to be re
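
      A quick back-of-the-envelope check of those numbers (a sketch only; the 12 and 24 bytes-per-pixel figures are the parent's assumptions, not measurements from any particular engine):

          # Frame-buffer footprint and per-second bandwidth at "8K", assuming
          # each buffered byte is touched once per frame.
          WIDTH, HEIGHT = 7680, 4320
          FPS = 60

          for label, bytes_per_pixel in [("conservative", 12), ("deferred", 24)]:
              frame_bytes = WIDTH * HEIGHT * bytes_per_pixel
              bandwidth = frame_bytes * FPS
              print(f"{label}: {frame_bytes / 1e6:.0f} MB/frame, "
                    f"{bandwidth / 1e9:.1f} GB/s at {FPS} FPS")
          # conservative: 398 MB/frame, 23.9 GB/s
          # deferred:     796 MB/frame, 47.8 GB/s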
    • by AvitarX ( 172628 )
      Agreed.

      With a 75 inch TV you need to be about 5 feet away to start seeing the difference.

      Even for gaming I doubt many people want to be so close to such a beast.

      The same TV only shows a benefit over 1080p within about 10 feet (HDR aside).

      If it weren't for HDR I personally would have gone with 1080p, since I don't game much and like to lie back on a couch that puts me more than 10 feet away for movies.

      I'm sure there are some people that want to sit less than 5 feet away from a 65 inch TV, but I doubt it's many, much bette
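
      For what it's worth, the 5-foot figure matches the usual 20/20-vision rule of thumb (one arcminute per pixel). A small sketch of that calculation, with the one-arcminute threshold as the only assumption:

          import math

          def max_useful_distance_in(diagonal_in, h_pixels, aspect=(16, 9)):
              """Distance at which one pixel subtends 1 arcminute (20/20 acuity).

              Sit farther back than this and a finer grid stops being resolvable.
              """
              w, h = aspect
              width_in = diagonal_in * w / math.hypot(w, h)
              pixel_pitch = width_in / h_pixels
              return pixel_pitch / math.tan(math.radians(1 / 60))

          for h_pixels, name in [(3840, "4K"), (7680, "8K")]:
              d = max_useful_distance_in(75, h_pixels)
              print(f'75" {name}: resolvable within ~{d / 12:.1f} ft')
          # 75" 4K: ~4.9 ft (the "about 5 feet" above); 75" 8K: ~2.4 ft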
    • by rtb61 ( 674572 )

      People always forget: resolution does not exist on its own. Screen size is the biggest driver of resolution, and of course distance from the display. Get closer to bigger displays and you need more resolution. And let's be honest, people want to cover an entire wall with a screen, say 2m x 3.5m, with the resolution to use that from a distance of say 3.0m, the whole wall lit up with whatever drives your fancy at the time, often scenery, using various areas of the screen for various outputs. The flip side

  • Besides slowing down the game since it is more work to display, can your eye notice the difference while playing the game?
    • No word about frame rates? I would rather have HD at a decent frame rate than 4k at a sluggish or inconsistent rate. That's where the effort should be going.
    • by Merk42 ( 1906718 )
      Of course I can! Why, I could still see the difference in 16K! Sorry about your bad eyes Grandpa!
    • Besides slowing down the game since it is more work to display, can your eye notice the difference while playing the game?

      Yes, on a large screen sitting at typical gaming distances. It's probably the last step though. Full HD to Ultra HD is a huge improvement for me, and I'm still very happy to leave my desktop scaling at 100%. Going to 8K (eventually), I will finally want to scale up my desktop a bit. When a minimum height pixel art font can finally be rendered so small that it's difficult to read even if I lean in, then we're finally at the highest resolution I want. I want a resolution slightly better than my vision.

      • by swilver ( 617741 )

        I want a resolution slightly better than my vision

        No need to upgrade, just wait a few years.

  • by Clouseau2 ( 1215588 ) on Tuesday October 06, 2020 @01:26PM (#60578264)
    VHS to DVD was a huge jump in quality. DVD to HD was a smaller jump. HD to 4K is often barely noticeable. I doubt many will care about 4K vs 8K.
    • by dgatwood ( 11270 )

      But... but... if we don't have 8K, how can we see the mole on Lara Croft's elbow without zooming in?

      8K is ridiculous. 4K is borderline ridiculous as a delivery medium. The only real benefit to increasing resolution is on the camera side, where shooting in 8K means you can crop significantly before the quality falls below the 1080p level that your customers can just barely tell is better than 720p.

      8K games? The need for more resolution is at this point a made-up problem. The only plausible benefit to pu

      • Now that peasants can afford a 65 inch 4k TV and a gaming console, the industry needs a new label and a new buzzword to move forward with hardware upgrading hysteria.

        • I think that they were hoping that "4K VR" was going to be the next buzzword, since most people don't have a VR headset yet, and having to power 2 4K displays on your head will require a next-generation video card.

    • A few months ago I upgraded to an ultra-wide, 2560x1080 monitor because IMHO gaming in 21:9 gives a better experience than 16:9. The jump to 3440x1440 meant a more expensive monitor and a more expensive GPU.

    • I doubt many will care about 4K vs 8K.

      Has the last 100 years of consumerism and marketing taught you nothing?

      customer:"what's the difference between 4k and 8k?"
      sales-drone: "well the 8k roughly twice as good"
      customer: "I like good, so more good is better. i'll take it!"

    • If you drew a graph of how displays have progressed and flipped it horizontally you'd have the graph of how good my eyes are.

      I think the lines crossed when 720p came out.

    • by gweihir ( 88907 )

      The usual posers will love 8K, but I think the sane rest can safely ignore it.

    • You should put on a VR headset like the PSVR 2 before you decide that the difference between 4K and 8K is irrelevant.

    • by r_naked ( 150044 )

      VHS to DVD was a huge jump in quality. DVD to HD was a smaller jump. HD to 4K is often barely noticeable. I doubt many will care about 4K vs 8K.

      In this case size DOES matter. I would agree with you on a 50 inch or lower display device, but I just recently went from a 65 inch HD to an 85 inch 4k and it most certainly is noticeable. Before you say not many people will be buying 75 or 85 inch displays -- they have dropped in price drastically this past year.

  • EVER? (Score:5, Insightful)

    by jason.hall ( 640247 ) on Tuesday October 06, 2020 @01:26PM (#60578272)
    He doubts if it will EVER be a standard? Even in 100 years? Ever is a long time. And 640KB is enough for anyone...
    • Probably not. I mean - will 2D array resolutions remain relevant if we move to light field projection displays?

    • Maybe not "ever", but possibly for quite a while, because there's a limit to our ocular wetware. There's little point in exceeding what the human eye can discern.

      It's similar to how very few people are clamoring for audio sample rates and bit depths beyond 16-bit/48kHz, simply because the vast majority of humans can't perceive any difference if you increase those specs. It's literally just a waste of CPU horsepower, bandwidth, and storage space that could be better used elsewhere.

      Note that this may not apply to

      • audio beyond 16-bit/48kHz, simply because the vast majority of humans can't perceive any difference if you increase those specs

        Make that 44.1kHz, and 100% kind of counts as "vast majority".

        Not a single human has demonstrated hearing of frequencies above slightly over 20kHz in a controlled environment. Note that most audio equipment will produce artifacts in the audible range if there's ultrasound input, but that's a speaker error. Once you eliminate that (e.g. by having two speakers, one for sub-20kHz, one for above-20kHz), no difference can be heard.

        As for 16-bit, humans actually can hear a difference -- in a laboratory quiet room, with volume set to near the p

        • Piffle, I have done exactly that on more than one occasion, and I'm hardly special in that regard. I'm older now so I may not hear those sounds nearly as well as I did in the past.

          Some people can hear up into the mid-20s (kHz). When I went through MEPS I had to perform my hearing test three times, as they wanted to make sure my hearing test was accurate. The second time with the hearing doctor on hand. The third time with the technician from the manufacturer and the hearing doctor on hand t
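
          The numbers behind the 44.1kHz/16-bit claims are easy to check. A minimal sketch (the ~20kHz audibility ceiling is the standard figure the thread is arguing over):

              import math

              SAMPLE_RATE = 44_100   # Hz, CD audio
              BIT_DEPTH = 16

              # Nyquist: a sample rate can represent frequencies up to half of itself.
              nyquist = SAMPLE_RATE / 2
              print(f"Nyquist limit: {nyquist:.0f} Hz")  # 22050 Hz, above ~20kHz hearing

              # Each bit of depth adds ~6.02 dB of dynamic range.
              dynamic_range_db = 20 * math.log10(2 ** BIT_DEPTH)
              print(f"16-bit dynamic range: {dynamic_range_db:.1f} dB")  # ~96.3 dB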

    • by Tablizer ( 95088 )

      [Ever?] Even in 100 years? Ever is a long time. And 640KB is enough for anyone...

      Head-plugged holographic projections will replace video by then. Now git off my flying lawn!

    • He doubts if it will EVER be a standard? Even in 100 years? Ever is a long time. And 640KB is enough for anyone...

      While I share your sentiment, our retinas do have a finite resolution, and adding more resolution after reaching that would be pointless. So in this particular case, by the time you reach 640K, it almost certainly is more than enough for anyone!

      • But what if it's a 100ft (30m) screen? Then surely even 640K wouldn't be enough, you'd see the pixels!
        • What's the optimal resolution for 'stadium' gaming at an IMAX theatre? :)

          But what if it's a 100ft (30m) screen? Then surely even 640K wouldn't be enough, you'd see the pixels!

          Not if you were far enough away from it that you could see the whole picture.

      • Hmm...
        I can easily tell the difference between 144 DPI and 300 DPI printing.
        I can tell the difference between 300 DPI and 600 DPI, though it's not easy.
        I know people who can tell the difference between 600 DPI and 1200 DPI.

        If I hold a piece of paper at normal viewing distance, it covers about half my monitor in my field of view.
        300 DPI on a normal piece of paper is 2400x3300. Twice that (remember, it covers half the monitor) is 4800x3300, or roughly 4K.

        In other words,
        300 DPI is approximately the same res as 4K.
        6

        • I work in printing (though I'm not so technical).

          From what I can tell, though, the DPI in a print does not carry the full bit depth of a monitor pixel. As in, each dot can't have 256 levels of C, M, Y, and K.

          So a 1,200 DPI printer puts down roughly 5.8Mbit of data per square inch (4 * 1200 * 1200 one-bit dots), while a screen that holds 24 bits per pixel only needs 500 DPI to convey that much data (500 * 500 * 24 = 6Mbit).

          Obviously there's a balance where jagged edges can't be smoothed with better color depth, but with printing, 1,200 DPI is not about the resolution of the edges, it's about the ability to simulate color depth and give a smooth grading of color.

          With RGB color there are some colors that will step with only 8 bits per channel (specifically in a blue gradient), and there are some colors that can't be represented (same with CMYK printing, though), but for the most part on a display you have enough color data per pixel that you don't need to worry about the dots being smaller than can be perceived for the sake of mixing colors, while on a printer you most definitely do.

          Note, some printers will use light inks to effectively give them more bits per dot too. I'm making the assumption you're talking about a stochastic screen on coated paper using a 4-color printer.
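
          A sketch of that comparison (assuming, as above, binary on/off dots for each of the four process inks versus 24-bit RGB pixels):

              # Data density per square inch: binary halftone dots vs. full-depth pixels.
              INKS = 4                  # C, M, Y, K: one bit each (dot on or off)
              PRINTER_DPI = 1200
              SCREEN_BPP = 24           # 8 bits per R, G, B channel
              SCREEN_PPI = 500

              printer_bits = INKS * PRINTER_DPI ** 2
              screen_bits = SCREEN_BPP * SCREEN_PPI ** 2
              print(f"1200 DPI 4-ink printer: {printer_bits / 1e6:.2f} Mbit/sq in")  # 5.76
              print(f"500 PPI 24-bit screen:  {screen_bits / 1e6:.2f} Mbit/sq in")   # 6.00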
        • by gweihir ( 88907 )

          You are overlooking that all printing is full-on or full-off; grey and color tones have to be simulated. There is no such limit on screen pixels, so your calculation is badly off as a result.

          • For laser. There are a lot of inkjets with one or more grey ink tanks for anti-aliasing text and better photo reproduction.

            • by gweihir ( 88907 )

              Still badly off. These printers cannot even remotely produce the colors a monitor can in one pixel.

        • I did the same DPI calculation: 4K is about equivalent to 300 DPI. As far as I am concerned, a 300 DPI laser print is good quality. The text and lines are crisp, without anti-aliasing needed to mask the jaggies. On that basis, anything better than 4K is a waste, unless it is on a very large screen.

          Another point that I think is generally true is that dynamic scenes require less DPI than static prints. I believe this is the basis for some forms of lossy video compression.

    • by nagora ( 177841 )

      He doubts if it will EVER be a standard? Even in 100 years?

      Sure. It will be just as completely pointless in 1000 years as it is today. Why make a standard of something that's useless?

    • There is not enough 8K adoption yet. We may jump to 6K, 16K, or whatever next. 8K does make some sense, but there is no guarantee it will be adopted. 4K may also be considered good enough for the next 20 years.

    • by gweihir ( 88907 )

      Look at other things: B/W laser printing. The resolution increases pretty much stopped at 1200dpi. Sure, some printers claim more, but there is no use-case for it. Most documents get printed at 600dpi, because that is entirely sufficient.

      4K is the resolution where the human eye cannot see defective pixels anymore without a magnifier, given the usual monitor sizes. IBM explored that about 25 years ago. It is one factor which will eventually make 4K the standard, because manufacturing and testing is a lot cheaper.

    • "Ever is a long time."

      Unless human eyes are going to change, it seems fair to say "ever".

    • I get your points, certainly, but the human eyeball only has a certain resolution.

      My understanding is that to even SEE the difference in a 4k TV (from 1080p HDR) you have to sit like 4' from the screen... otherwise human vision is MECHANICALLY unable to resolve the distinction.

      https://www.forbes.com/sites/k... [forbes.com]

      Personally, I believe that we're going to push past 4k and maybe even 8k *but with regard to VR video*, not giant screens. I'm not broadly experienced myself with VR, but my understanding is in that context we

  • Only for VR... (Score:4, Interesting)

    by Junta ( 36770 ) on Tuesday October 06, 2020 @01:30PM (#60578288)

    4k is plenty at realistic non-VR FoV (even at the extreme of, say, a 100" (250cm) screen at 6' (2m), 4k and 8k would look the same).

    For a VR headset needing to cover the whole field of view, the panel would have to be about 8k to be indistinguishable resolution-wise from normal vision. The rendering horsepower requirement might be relaxed if you have foveated rendering with eye tracking.
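
    A rough sketch of why the VR number lands near 8k (assuming a ~110-degree horizontal FoV per eye, a common headset figure, and the usual 60-pixels-per-degree "retinal" threshold):

        # Pixels-per-degree budget: why VR needs far more than a flat screen.
        RETINAL_PPD = 60        # ~1 arcminute per pixel, 20/20 acuity
        HEADSET_FOV_DEG = 110   # horizontal field of view per eye

        required_h_pixels = RETINAL_PPD * HEADSET_FOV_DEG
        print(f"Per-eye horizontal pixels for retinal VR: {required_h_pixels}")
        # 6600: between 4K (3840) and 8K (7680) panel widths, because the
        # pixels are spread over the whole FoV instead of a ~30-40 degree screen.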

  • If the rendering is sub-photorealistic, what advantage is there to greater resolution?
    • That's what I was thinking. If games were aiming for (and actually achieving) photorealistic instead of stylistic I could see potential benefits of throwing more pixels at it. Throwing more pixels at toonshaders, though, just makes the outlines less jaggy.
  • Look up the field of view range and the ability to resolve features for 80%+ of humanity.

    Of course at extreme contrast ratios, nothing below 120fps, and per eye.

    But I bet "videophiles" will demand 128K for "extra warmth". :)

  • by jklappenbach ( 824031 ) on Tuesday October 06, 2020 @02:01PM (#60578384) Journal
    Players may not notice the difference between 4k and 8k on a standard display. But for VR, everyone will certainly notice. Having 4k per eye and a GPU that can drive it will really raise the bar for the technology.

    Still think VR is a joke?

    Try Asgard's Wrath [oculus.com], Half-Life: Alyx [half-life.com], or the latest mind-blowing release of Star Wars: Squadrons [ea.com] (which made my inner 8yo squee with absolute delight -- this is what it feels like to actually pilot an X-Wing or TIE Fighter!), or any of the major releases that have come out in the last year and a half, and you'll find that the platform isn't just maturing, it's taking off.

    8k can't come soon enough.
    • Players may not notice the difference between 4k and 8k on a standard display. But for VR, everyone will certainly notice. Having 4k per eye and a GPU that can drive it will really raise the bar for the technology. [...] 8k can't come soon enough.

      4K + 4K is not the same as 8K unless the marketers have taken control again.

  • ...640 k ought to be enough for everyone.
  • 35mm film works out to roughly 8K when we digitize it. Therefore, there will be a move to 8K, even slightly higher resolutions if we want to profit from IMAX films.

    Beyond that is hardly tenable.

    You will have pro monitors at 14K or more for editors (in order to produce in 8K you need to record/mix/edit/post-produce at higher resolutions; for 8K that is currently 12K), and you will have 8~10K monitors/TVs for normal people to consume content (8K for 35mm-equivalent content, 10K for IMAX and other 70mm formats). Non-gaming content will drive

    • Didn't they record in 35mm mainly for the same reason you mentioned editing at a higher resolution? To give them some overhead to zoom in post-production if they wanted, to be able to blow up a picture, etc.?

      The problem you run into is that even 4k is beyond most people's vision abilities. People won't be able to tell the difference between 4k and 8k in real world situations.

      I'm thinking that doing so might require a shift to headsets or such, so you get enough field of view to use all those p

    • Yes, you can get physical 8k from 35mm film if you really want to depict every grain in its original location. But the resolution of the image is usually far less, 4k
  • by unami ( 1042872 ) on Tuesday October 06, 2020 @02:20PM (#60578444)
    So it will come to videogames. Obviously it doesn't make much sense in 2D -- hell, even most movies in cinemas are still in 2k and hardly anyone complains about them being pixelated.
  • Considering many people don't have a 4K TV yet, and in North America you are lucky to get 25 Mbit, streaming 8K doesn't seem critical. Also, unless you are getting a screen larger than most people's living rooms, such as a cinema, you are essentially asking for a super-charged racing truck when all you need is a good car. In the end, 8K at this point in time is really just porn for the spec nerd and those getting duped by the marketing.

    Note that 8K streaming is about 80-100 Mbit/s, depending on compression rate
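
    To put that bitrate next to the download-cap worry upthread, a quick sketch (the 1TB cap is hypothetical, though typical of North American ISPs):

        # How long 8K streaming takes to burn through a monthly data cap.
        BITRATE_MBPS = 100      # upper end of the 80-100 Mbit/s estimate above
        CAP_TB = 1.0            # hypothetical 1 TB monthly cap

        cap_bits = CAP_TB * 1e12 * 8
        seconds = cap_bits / (BITRATE_MBPS * 1e6)
        print(f"Cap exhausted after ~{seconds / 3600:.0f} hours of 8K streaming")
        # ~22 hours: less than 45 minutes a day before overage charges.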

    • I don't think bandwidth is all that relevant since games are practically always rendered locally.
  • eagles.

    4K is a waste of time, money, and electricity. 8K is just a joke.

    • by gweihir ( 88907 ) on Tuesday October 06, 2020 @09:12PM (#60579714)

      4K actually has a manufacturing advantage: At 4K with standard monitor sizes humans cannot see single defective pixels anymore. Hence manufacturing becomes quite a bit cheaper. So eventually, the standard will be 4K, but that may still take a while. And at that point, the technology will be mature and that is it.

  • Bring on dual 8k with 180 degrees FoV for VR, baby...

  • How about you stop improving the resolution of games and instead improve the scope and scale of them?

  • who doesn't offer VR, unlike their competitor, is not sure why developers would support their competitor's hardware.

    Wake up dude, 8K gaming has nothing to do with your shitty TV.

    • HoloLens and Windows Mixed Reality would like a word!
      • Two products that have nothing to do with the Xbox and aren't sold with the Xbox would like a word? Why? Do they also not understand what I'm saying?

        • Given VR has absolutely nothing to do with the gaming resolution of TVs, I figured you were not confining yourself to just the topic at hand.
          • What has VR being a component of the PlayStation got to do with the TV again? I think you fundamentally missed my point. To reiterate: the main competition to the Xbox is all in on VR. For that, the ability to play 8K is important. For your TV, not at all.

            • All in on VR? VR is a relatively niche add-on to a PS. It sold relatively poorly compared to the number of PSs sold, and is unlikely to significantly improve on that this gen unless they find some magic bullet for the cost. So no, Sony is not all in on VR; VR is an extra. They are all in on living room, and primarily TV, gaming.
        • Given the scope of the discussion was quite clearly gaming on TVs, as he also points out 8k gaming WILL be available in other mediums (e.g. PC), it sounds like you are the one that didn't really understand.
          • Given the scope of the discussion was quite clearly gaming on TVs, as he also points out 8k gaming WILL be available in other mediums (e.g. PC), it sounds like you are the one that didn't really understand.

            Why do you arbitrarily move goalposts to create a senseless argument? It's not like PC gaming is inundated with 8K monitors. Hell, Steam shows most gamers aren't even using 4K screens. All of which is completely irrelevant, since:
            a) the topic he was covering was fundamentally the graphics power of the Xbox.
            b) his complaint was about "The display capabilities of devices are not really there yet."
            c) his competitors are about to ship consoles along with accessories with display capabilities exceeding 4K

            So please lea

            • You were the one that moved the goalposts. There was never any statement that 8k will NEVER be for gaming; it was about 8k for television-based gaming. The Xbox Series X is capable of 8k video streams out of the box, and therefore any accessories etc. would also be capable of that. Nonetheless, it does not negate his statement that 8k is unlikely to be the gaming standard for TV any time soon, if ever. I would imagine the competition has pretty much the same agenda, unless they are trying to push 8k TVs.
  • since my eyes max out around 1080p and even a good set of eyes around 4k. At a certain point you're just setting it to 11.
    • since my eyes max out around 1080p and even a good set of eyes around 4k. At a certain point you're just setting it to 11.

      At my former place of employment they got 4K monitors for everyone when fitting out the new office. Very nice, but I couldn't read any of the text on the screen without serious eye strain. So mine got run at 1080p, and I was not alone. It was easier to low-rez the monitors than to mess with scaling every time I switched between WFO and WFH.

  • Everyone is mentioning VR below, and I agree with the general idea. But 8k isn't double 4k; it's 4x the pixels. Do we actually need better than 4k per eye?
    • Regardless of what we need, Moore's law will ensure that processing 8k data becomes cheap enough, and the competition plus progress in LCD space will bring down the cost of 8k panels. In 6-8 years, lo and behold, the only TV you can buy is 8K LCD, just like today any decent TV now has 4K resolution.

    • Everyone is mentioning VR below

      Side note: When you post after them, it's actually above.

  • But I wouldn't assume that we're always going to be using a single flat screen in front of the user (player) for visual output. At some point, home users will be able to afford something like a Cave system [visbox.com]. Take a dozen or so 4K or 8K monitors, add AR glasses to that (so individuals can have different HUDs), and you need to push a lot of pixels.

    It may not make a lot of sense for the average home gamer kid to have 8K, but even 8K is just a stepping stone to other things.

  • For lack of a better word, we need more "depth" of the pixels, not a higher resolution. According to "science" (online charts), at the distance I am viewing, I barely need a 1080p TV. Anything higher would be wasteful (though I can actually see some minor difference). For 8K, I think I would need to sit two feet away.

    However more bits per pixel, larger color gamut, and faster refresh rates are welcome. I can easily see their difference. In fact it is easy to see difference between not only SDR, HDR, but als

  • Would I rather have 16X the pixels of FullHD but the same render quality, or is FullHD enough pixels if you put 16X more effort into calculating each pixel? Maybe just add fancier rendering techniques, more geometry, and whatever; the extra pixels don't really help that much.
  • I got my first hard drive for an Apple II in the early 80's. The salesman said I would never be able to fill up its massive 5MB hard disk. I was also told I would never need more than 2400 baud. 8K may not be a thing today. The downloads would be insane and few people could run it now anyway. That being said, technology changes. Internet connections will get faster. AT&T is already rolling out 2Gb Internet. In 5 years that may be slow, and 8k will have a larger following and new GPU tech will enabl

  • Porn shall lead the way, as usual. Though if you're looking for purposefully wrinkly skin, skin smooth filters make this resolution pointless.

  • If I was the one choosing where to throw silicon and algorithms, give me rendering closer to reality at 4K than something less at 8K. When I see Lara Croft or the male equivalent, I want to believe.
  • If you actually look at the pixels per inch on a 65" 1080p, 65" 4k, and 65" 8k, you'll see just how close you need to sit to gain the benefits.

    I speculate that a 4k television, with a /full/ 4k resolution game being pushed out to it, would not see benefits of going to a higher resolution until you exceed 100-130" of size.

    I have a 1080p 65" set. I sit quite close to it and I can /just/ discern pixels with my glasses on, and I mean /just/. I may see some genuine benefit moving to 4k at the same size, not
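
    Those pixels-per-inch figures are quick to compute. A minimal sketch (65" diagonal and 16:9 aspect assumed):

        import math

        # Horizontal PPI of a 65" 16:9 panel at three resolutions.
        DIAGONAL_IN = 65
        ASPECT_W, ASPECT_H = 16, 9
        width_in = DIAGONAL_IN * ASPECT_W / math.hypot(ASPECT_W, ASPECT_H)

        for h_pixels, name in [(1920, "1080p"), (3840, "4K"), (7680, "8K")]:
            print(f'65" {name}: {h_pixels / width_in:.0f} PPI')
        # 65" 1080p: 34 PPI, 4K: 68 PPI, 8K: 136 PPI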

    • by gweihir ( 88907 )

      There never will be 8k games. Humans stop being able to see defective pixels at 4k. There is no benefit to higher resolutions after that.

      • I mean if the display is in excess of 150" it's possible. Unlikely but possible.

        • by Junta ( 36770 )

          It depends on how close. At 6 feet away a 100" screen will be indistinguishable.

          At inches away with lenses to adjust the focus to actually be in focus, you could in theory tell the difference on a 3" screen. Though probably not that much of a difference even then.

  • The Xbox One at launch was being positioned as a console that offered 4k gaming [polygon.com] but in reality it could barely do 1080p games.

    "Xbox One supports both 3-D and 4K," [Microsoft's Larry] Hryb said in response to a question about the console including these features.

    [Yusuf] Mehdi also revealed the console will support 4K for Blu-ray at launch, with the possibility of games and other content being available at 4K in the future if they are rendered at that resolution.

    In fact through its entire existence the launch
