Lag, screen freezes, pixelation, and all the same problems Google has been having with Stadia....
They need to get it into their heads that we are nowhere near capable of delivering the data bandwidth required per frame, the frame rate, AND the response time CONSISTENTLY enough to provide a reasonable gaming experience. If you're playing some multiplayer game and a player lags for a moment and does a bit of rubber banding, it's no big deal. But if you're "immersed" in a game and suddenly your frame rate drops...
Complete nonsense. I've been streaming 1080p games for years now.
The bandwidth required per frame isn't even a metric that means anything here, since streaming happens with either H.264 or H.265, and both use temporal compression.
I can tell by your post that you haven't actually tried it. So why do you judge it as if you had?
If I'm not mistaken these compression protocols depend very much on large sections of the screen remaining unmodified from one frame to the next, and only encode the parts that have changed, conserving bandwidth. Now if you had any idea what new techs like DOTS will mean for future games where you can literally have almost everything on the screen moving without reducing frame rates at all, these protocols will not be as useful.
If I'm not mistaken these compression protocols depend very much on large sections of the screen remaining unmodified from one frame to the next
You are not mistaken.
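To make that concrete, here's a toy sketch of block-based inter-frame coding. This is an illustration of the idea, not the actual H.264/H.265 toolchain (which uses motion compensation and residual transforms, not exact-match tiles): the encoder only spends bits on tiles that changed since the previous frame, so a mostly static scene is far cheaper than one where everything moves.

```python
import random

random.seed(0)  # deterministic for the demo

def make_frame(w, h):
    """A random grayscale frame as a list of rows."""
    return [[random.randrange(256) for _ in range(w)] for _ in range(h)]

def changed_blocks(prev, curr, block=8):
    """Count block x block tiles that differ between two frames."""
    h, w = len(curr), len(curr[0])
    changed = total = 0
    for y in range(0, h, block):
        for x in range(0, w, block):
            total += 1
            if any(prev[r][x:x + block] != curr[r][x:x + block]
                   for r in range(y, min(y + block, h))):
                changed += 1
    return changed, total

prev = make_frame(64, 64)

# Scene A: only one 8x8 tile moves -> one tile to re-encode.
mostly_static = [row[:] for row in prev]
for r in range(8):
    for c in range(8):
        mostly_static[r][c] = random.randrange(256)

# Scene B: every pixel changes ("white noise") -> every tile re-encoded.
all_moving = make_frame(64, 64)

print(changed_blocks(prev, mostly_static))  # one tile changed out of 64
print(changed_blocks(prev, all_moving))     # all 64 tiles changed
```

The "white noise" case is exactly the worst case the thread is arguing about: when every tile changes, inter-frame coding degrades toward sending full frames.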
Now if you had any idea what new techs like DOTS will mean for future games where you can literally have almost everything on the screen moving without reducing frame rates at all, these protocols will not be as useful.
That's fucking nonsense.
Increased processing power (fucking hopefully) does not mean your game looks closer and closer to white noise.
You are correct, they don't compress white noise for shit.
But you're insane to think that consecutive frames will ever be anything but nearly the same.
Yeah but there are real limits. The human eye can perceive changes and stutters in frame rate - in fact changes in motion are instantly more noticeable than changes in color or intensity - it's the way we're wired as part of our survival. This is even more noticeable if you're actually participating in a "simulated" fight or flight situation versus passively watching a video about ants [youtube.com].
In highly demanding graphical environments - which is after all what they CLAIM to be replacing because a run of the mill
we might soon be reaching levels where either the compression algorithm is going to have to force too much loss of info/detail
No. Just no.
The loss of detail is based on the bandwidth. The more bandwidth you allocate to the stream, the more frame-divergent detail you can pack in there.
Currently, I can play 1080p60 games using around 30 Mbps of bandwidth. As bandwidth increases, it will get *easier* to accommodate this. 4K streaming is a thing on YouTube.
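A rough back-of-the-envelope on that 30 Mbps figure (assuming 24 bits per pixel, no chroma subsampling) shows how much work the codec is doing:

```python
# Raw, uncompressed 1080p60 at 24 bits per pixel vs. a 30 Mbps stream.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

raw_bps = width * height * bits_per_pixel * fps
raw_mbps = raw_bps / 1_000_000
print(round(raw_mbps))                 # ~2986 Mbps uncompressed

stream_mbps = 30
print(round(raw_mbps / stream_mbps))   # ~100:1 compression ratio
```

So a playable 1080p60 stream at 30 Mbps is already leaning on roughly 100:1 compression, which is the headroom temporal coding provides when frames are mostly similar.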
I'm not arguing such a level doesn't exist, but not for practical human purposes.
You seem to be arguing for the end of all internet video recordings of games due to some kind of weird convergence of games into something that models as white noise.
This is, again, nonsense.
Is a streamed temporal compression algorithm inferior to directly rasterized frames? Of course.
Not enough for you to care though, that I assure you.