A Proposal To Fix the Full-Screen X11 Window Mess
jones_supa writes "The SDL developers Ryan Gordon and Sam Lantinga have proposed a window manager change to sort out the full-screen X11 window mess, primarily for games. The proposal is to come up with a _NET_WM_STATE_FULLSCREEN_EXCLUSIVE window manager hint that addresses the shortcomings of the full-screen hint currently used by most games, _NET_WM_STATE_FULLSCREEN. Ryan and Sam have already worked out an initial patch for SDL, but they haven't tried hooking it up to any window manager yet. For those interested in the details, information is available in this mailing list message. One of the key changes is that software would ask the window manager to change the resolution, rather than tapping RandR or XVidMode directly. Martin Gräßlin of KDE was rather wary about the patch and said that games changing the resolution just tend to mess up the desktop."
Seems like a reasonable idea, given a bit of time to mature as a spec. In KDE's case, a separate daemon from the window manager handles resolution changes so going through the WM would add complexity, and the plasma shell still has no way to realize that it shouldn't reflow the desktop widgets. Setting window properties seems like a sensible IPC method for communicating intent though (without making yet another aspect of the X desktop reliant upon the not-very-network-transparent dbus): "hey, I need to resize, but just for me so don't reshuffle the desktop and docks."
Hilarious excuses (Score:5, Insightful)
So, ugh... fix your desktop?
Re: (Score:3)
Agreed. Why can't the plasma widgets just save their positions and change back when the resolution changes back?
Re:Hilarious excuses (Score:5, Insightful)
The desktop doesn't know what caused the changes, so you could run into a lot of strange issues. Imagine you lay out your desktop on a 30" 2560x1440 monitor, then switch to a 1920x1080 monitor and add/remove/move an icon. What happens when you reattach the first monitor: should everything just "snap back" to the places it had, even if you've since arranged your icons completely differently? To me the solution outlined here seems much smarter: just let the game have its own "screen" with its own settings, no need to even tell the other windows about it.
Re:Hilarious excuses (Score:4, Informative)
This is the exact purpose of this proposal: to create a new signal that would tell the window manager that the change is temporary and only takes effect while a specific window has focus. This way the window manager would know there's no need to even move the icons in the first place.
Re: (Score:2)
Re: (Score:2)
Ever hear the story about the straw that broke the camel's back?
As I see it, with the way things stand and the direction in which they seem to be going, we generally need leaner (simpler, perhaps more clever) solutions to problems, not heavier ones.
Even if it's just a tiny bit of additional overhead. Operating systems are slow enough, these days, almost as if nobody even tries to optimize anything anymore with a gran
Re:Hilarious excuses (Score:5, Interesting)
Why don't games just spawn a separate X11 window server instance with a different resolution on a separate VC? Adding proper resource sharing between X11 instances seems like it would be a lot easier to do than rearchitecting all the existing apps to do the right thing during a temporary resolution change.
And there's no benefit to a full-screen app running in the same X11 instance as any other app other than making it possible to transition a window from being a normal window to a full screen window and back, and with a resolution change, that won't work very well anyway, which makes even that argument mostly moot.
Re:Hilarious excuses (Score:5, Insightful)
Why don't games just spawn a separate X11 window server instance with a different resolution on a separate VC? Adding proper resource sharing between X11 instances seems like it would be a lot easier to do than rearchitecting all the existing apps to do the right thing during a temporary resolution change.
And there's no benefit to a full-screen app running in the same X11 instance as any other app other than making it possible to transition a window from being a normal window to a full screen window and back, and with a resolution change, that won't work very well anyway, which makes even that argument mostly moot.
Why the hell should the user suffer the extra resources taken up by a second X instance because the damn paradigm is a big pile of hurt that goes back to the early days? I remember all the arrogance of X Windows during NeXT's days and the decisions around Display PostScript. It's rather clear the NeXT design has always been superior, and OS X benefits from it.
Comment removed (Score:5, Insightful)
Re:Hilarious excuses (Score:5, Insightful)
I mean c'mon guys, X11 has had a good run but it should probably be in the same group as Gopher and Telnet,
Aw geez not this crap again.
Why, whenever anything new comes up, do a shrill group of people start shrieking omg omg x11 is so old omg omg scrap it omg omg we can't possibly make a minor tweak to fix a minor problem omg omg omg legacy omg omg omg bloat omg oh the legacy omg won't someone please THINK OF THE CHILDREN omg legacy.
Without ever stopping to *THINK*.
Just stop and think. Not about X11, but about any GUI system.
The GUI runs at the monitor's maximum resolution. Things like windows are spread out over the whole area, as perhaps are icons, widgets etc.
If the user reduces resolution, a common thing to do is to move all the windows into the new area, otherwise they may become inaccessible.
So far so good. Nothing specific about X11 in there.
Any good system will have a protocol or API for changing resolutions so that 3rd-party resolution-changing programs are possible to write.
So far so good.
But in some cases you don't want to rearrange the windows because the resolution change is temporary, so you need to have an extra flag which tells the system that it's temporary and not to bother.
OK, still nothing about X11 in there.
Now this is a proposal to add such a flag using a mechanism for adding such flags which has been standardised since 1985. And it will work smoothly and be completely backwards compatible.
IOW the design of X11 is ideal for this kind of change and it shows how solid the underlying design is.
Nothing breaks. No need to have a ChangeResolution and ChangeResolution2 API, no need to deprecate the old API no need to break anything.
Seriously, if you scrapped the entire GUI and rendering system whenever a minor tweak is needed, you'd never get anywhere.
Re: (Score:3)
Linux people apparently like making things hard.
Now you're just making stuff up.
Every game i play runs at native, including some titles from the 2003 era.
Well that's nice for you. My netbook certainly can't push much on a 2560x1600 monitor (maximum supported resolution). More realistically, it struggles to do much above video playback at 1080p.
Re: (Score:2, Insightful)
Indeed, he didn't even realize that this flag wouldn't tell the widgets the resolution had changed, so they would never be rearranged in the first place. I doubt he has even read the proposed spec.
Re: (Score:3)
Except that we're no longer in the era of CRTs. Since LCDs have one native resolution, they should always be driven at that resolution. If a game wants to run at 640x480, then that should be accomplished by scaling it up and adding black bars, if necessary, but the signal to the LCD should still be at the original resolution.
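The scale-up-and-add-black-bars arithmetic this comment describes is simple enough to sketch. The helper below is hypothetical (not from SDL or any window manager), just to show the aspect-preserving math:

```python
def letterbox(native_w, native_h, game_w, game_h):
    """Scale a game frame to fit the native panel resolution while
    preserving aspect ratio. Returns (scaled_w, scaled_h, x_off, y_off),
    where the offsets are the black-bar margins on each side."""
    scale = min(native_w / game_w, native_h / game_h)
    scaled_w = round(game_w * scale)
    scaled_h = round(game_h * scale)
    return (scaled_w, scaled_h,
            (native_w - scaled_w) // 2,
            (native_h - scaled_h) // 2)

# A 4:3 game on a 16:9 panel gets pillarboxed:
# letterbox(1920, 1080, 640, 480) -> (1440, 1080, 240, 0)
```

The signal to the LCD stays at 1920x1080 the whole time; only the blit rectangle changes.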
Re: (Score:3)
Re:Hilarious excuses (Score:4, Insightful)
It's not that I don't trust the LCD. It's that when you change the resolution, you tend to screw other things up as well.
I have three monitors and I game on the center one. I like to keep my email and IRC open on the other ones while I play. But if the game adjusts the resolution, the positions of the other windows move around and I can no longer see all of them. This happens in Windows if the game doesn't run at the same resolution as my center monitor.
Re: (Score:2)
Or even multiples/divisors of that resolution, ideally exposed via EDID. Since they're exact divisors, the screen can do a simple, lossless point-sample scale, which is computationally cheap (compared to the common bilinear) and allows these low resolutions to go full screen with no added latency (the scaler chips inside most panels are slow). These are needed because desktops might be 2560x1600 but most GPUs won't run games well at that resolution.
nvidia's windows drivers support scaling in the gpu too, but unfortunately
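Enumerating those exact-divisor modes is a one-liner; the function name and minimum-width cutoff below are made up for illustration:

```python
def integer_modes(native_w, native_h, min_w=640):
    """Resolutions that divide the native mode exactly, so a panel or
    GPU can upscale by point sampling with no filtering artifacts."""
    return [(native_w // n, native_h // n)
            for n in range(1, native_w // min_w + 1)
            if native_w % n == 0 and native_h % n == 0]

# integer_modes(2560, 1600) -> [(2560, 1600), (1280, 800), (640, 400)]
# and 960x540 is an exact divisor of 1920x1080.
```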
Re: (Score:2, Insightful)
Why do game developers always assume that my computer doesn't have any other purpose except to play their game? I've got other stuff on this computer -- stuff that is more important than the games. My computer is my alarm clock, my calendar, and a communication tool, among other things. Games had darned well better stay in the window I put them in, or I won't be playing them.
Because not everybody wants to be annoyed by the rest of the UI when playing a game. Of course, when fullscreen is available it should be an option (and not the only way to play the game), but that isn't an excuse to completely get rid of it.
Re: (Score:2, Insightful)
Re: (Score:3)
That's crazy talk! We can't do that! It's impossi-
(Assuming it wasn't a boot-from-floppy game, that is.)
Games are the problem? (Score:2, Insightful)
Just start the goddamn games on a totally different TTY. There, problem solved!
Re: (Score:2, Redundant)
Just start the goddamn games on a totally different TTY. There, problem solved!
That's what I do to play games.
I usually just switch over to TTY1. Then I can load TREK73.BAS:
Re:Games are the problem? (Score:4, Insightful)
Not what I heard. Sounded more like "why the heck are you making the problem more complicated than it needs to be?"
Ideally, I am not sure why the heck a game of the full-screen sort would need X11 to begin with. Perhaps for portability. You wouldn't want to try running games over remote X either, so why?
Assuming there are nonetheless plenty of reasons in practice to want to make that work (starting with 'lots of existing games that do require it' of course) then why not just set them to start their own exclusive server instance, tuned for that purpose?
If it's a game that's supposed to be running full screen and not interacting with a desktop, why then force a desktop to be part of the environment at all? Keep it simple.
Re: (Score:3)
Music to my ears! (Score:5, Interesting)
With Linux finally becoming a more "proper" gaming platform (i.e. Steam and others), it's "about time" that this is dealt with. _NET_WM_STATE_FULLSCREEN_EXCLUSIVE, where have you been my whole adult life? Gotta hand it to Ryan Gordon ("Icculus," as I recall) for consistently making Linux gaming that much more viable.
Re: (Score:2)
I didn't see any information in the article, but what exactly is the problem with X11 full screen support? I don't game in Linux, and this is the first time I've even heard of this.
Re:Music to my ears! (Score:5, Informative)
I didn't see any information in the article, but what exactly is the problem with X11 full screen support? I don't game in Linux, and this is the first time I've even heard of this.
The biggest issue is that when the game goes full-screen, it changes the resolution to whatever the game is set to--which may or may not be what you keep your desktop at. Then, when you exit the game, the icons are usually huge; the taskbars are usually all messed-up (even when "locked!"), and you have to futz around to make it usable again. Also, many games on Linux won't even let you Alt-Tab to other windows! Either nothing happens; or the resolution won't be correct; or the game crashes. It's really unpleasant to deal with. Also, it's worth noting that many games (especially Linux games, sadly) are extremely limited about what resolutions they'll let you use--so even if you want to set the game to your native resolution, it might not work or let you even try.
Re:Music to my ears! (Score:5, Informative)
The article contained a lot of detail. The current mechanism is a two-step thing where the application first requests full-screen control from the WM. The WM then resizes the window to fit the current screen (which may not make the game happy), removes decorations, and then gets out of the way. Then the game changes the resolution and resizes the window again. The resolution change notification is delivered to the WM, which then propagates it to all of the applications, so if you want to play a fullscreen game at 640x480 then you may find that all of your application windows have resized themselves to fit in this screen when you exit it. The game then runs (hopefully) and if it crashes then in theory the WM is responsible for restoring the video mode, but in practice it doesn't know for certain that the game was responsible for changing it, so it may not.
With the new proposal, the game resizes its window to the resolution it wants and then requests full screen mode. The WM should then change the screen resolution to the closest to the current window size and doesn't propagate the resolution change notification to any other applications. This means that nothing else gets resized. If the game crashes, or if you do any kind of switching out of the game, then the desktop resolution will be restored.
And, while it's fashionable to hate X11, it's worth noting that Windows and OS X both suffer from some of the problems that this proposal is intended to fix.
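The "closest to the current window size" selection the proposal leaves to the window manager could, for instance, minimize distance over the modes RandR advertises. The metric below is an assumption for illustration, not something the spec mandates:

```python
def closest_mode(modes, win_w, win_h):
    """Pick the advertised video mode nearest the window size.
    The proposal leaves the metric to the WM; squared Euclidean
    distance is one plausible choice."""
    return min(modes, key=lambda m: (m[0] - win_w) ** 2 + (m[1] - win_h) ** 2)

modes = [(640, 480), (800, 600), (1024, 768), (1920, 1080)]
# closest_mode(modes, 820, 600) -> (800, 600)
```

The key point is that the WM makes this decision, so it also knows to suppress the resize notification to every other client.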
CRT's (Score:4, Insightful)
Who is still running a CRT? Who wants any program to change the resolution of their screen?
This strikes me as the wrong solution to the problem: A program should instead request the "current monitor's resolution" (because there can be more than one!) set its display area to that size, and then tell the window manager to "fullscreen" it by removing title bar and border decorations and moving it to (0,0) of that monitor. But NEVER EVER RESIZE MY MONITORS. Thank you. The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.
Re: (Score:3)
I agree, I have no idea why game windows are not handled better.
It is basically impossible to run many, quite possibly most, games in a window. And even the ones that do allow it often require editing of files or hacking the exe.
Theoretically the OS should be sent this visual data, and no matter how the game was programmed you could resize it/run it in a window.
Re: (Score:2)
Yes, theoretically, but in reality resizing stuff on the fly, particularly to odd, one-off resolutions to fit in a window, is a big performance sink - fine for some games but there is a good chunk of the market where that isn't acceptable. Plus, for many games, moving the mouse to the edge of the screen actually has a specific meaning. It's not always straightforward to determine whether you mean to throw the mouse against the edge of the screen in-game or just mean to move the mouse out of the window to che
Re:CRT's (Score:5, Insightful)
Who wants any program to change the resolution of their screen?
Someone whose graphics card isn't up to the task of running a game at full native resolution? That'd be my guess anyway; I haven't willingly used a lower resolution for a while. (Some games don't support high resolutions, or don't support widescreen resolutions, and there it's "reasonable" that they change it as well. But a program like that probably wouldn't use that in the first place, so whatever.)
The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.
I don't know enough about this proposal to say how it interacts with this (indeed, I'm rather disappointed by both the summary and TFA not actually, you know, saying what the problems are in the first place), but there's absolutely no reason why those goals are in conflict. In fact, the proposal specifically addresses this: "If the window loses input focus while fullscreen, the Window Manager MUST revert the resolution change and iconify the window until it regains input focus. The Window Manager MUST protect desktop state (icon positions, geometry of other windows, etc) during resolution change, so that the state will be unchanged when the window ceases to be marked as fullscreen."
Re:CRT's (Score:4, Informative)
For the myriad of responses that brought up this point: the answer is video card hardware scaling. E.g. add a flag _NET_WM_STATE_SCALING_ALLOWED which directs the WM to use hardware scaling from a fixed-size framebuffer, as is done by video players. Not only can you make it full screen, but you can resize it to any arbitrary shape and size (e.g. don't cover your widget bar, etc). Then the Window Manager decides what is "fullscreen". It could even make an app span more than one monitor when "fullscreen", or just one.
GPU scaling on the Xbox 360 (Score:4, Interesting)
For the myriad of responses that brought up this point: the answer is video card hardware scaling.
And this is exactly the solution that the Xbox 360 uses. A lot of newer games are fill rate limited. Because of the complexity of the pixel shaders that games use, the AMD Xenos integrated GPU in the Xbox 360 (similar to a Radeon X1800) can't run it with an acceptable frame rate at any resolution over 1024x576. So games use the Xenos's scaler to turn 1024x576 into 1280x720 or 1920x1080 pixels for the component or HDMI output.
Re:CRT's (Score:5, Insightful)
This. I came here to say the same thing, but you already had. Every single modern graphics card is very efficient at scaling textures, and in fact, LCD scaling these days most often ends up happening on the GPU anyway. Don't touch my screen resolution. Ever. If the goal is to get better performance by rendering at a lower resolution, then render at a lower-resolution offscreen buffer and scale that up to the screen resolution.
I wish Wine had a mode that did this for Windows games that expect to change the screen resolution and don't play well with Xinerama. These days I end up using the "virtual desktop" wine mode with per-game settings and KDE's window override support to put it on the right display head and remove the borders, but it's a suboptimal manual solution. The Linux game situation is slightly better (they tend to be configurable to respect the current resolution and usually get the display head right), but still don't have scaling support.
Need inspiration? Do what video players (particularly mplayer) do. That is how fullscreen games should work.
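Point-sampled (nearest-neighbour) upscaling, which is what a GPU blit or a panel scaler at an exact divisor amounts to, is easy to model on a toy framebuffer; this sketch treats the frame as a row-major list of pixel values:

```python
def point_sample_upscale(pixels, factor):
    """Nearest-neighbour upscale of a row-major 2D framebuffer by an
    integer factor: each source pixel becomes a factor x factor block."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in pixels for _ in range(factor)]

frame = [[1, 2],
         [3, 4]]
# point_sample_upscale(frame, 2) ->
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

No filtering, no blending: the result is pixel-for-pixel lossless, which is why integer factors matter.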
Re: rendering lower then scaling up to native (Score:5, Interesting)
Re: rendering lower then scaling up to native (Score:4, Informative)
This works - but wastes both ram space and performance.
Re: rendering lower then scaling up to native (Score:5, Interesting)
Exceedingly little, though.
Modern games already render to more than one off-screen buffer (necessitated by HDR, deferred shading, and other fun things), only blitting and gamma-correcting the final bits to the screen's framebuffer at the very end.
The tiny amount of RAM occupied by the 8-bit-per-component framebuffer at a large screen resolution is dwarfed by these several buffers, some of which will use 16-bit components.
The amount of GPU time needed to draw a solid full-screen quad really is too trivial to care about.
Re: (Score:2)
I want to change the resolution, and I will tell you why: on a 32" monitor, I can't read the text unless it runs at 720p or even 800x600.
Resolution independence (Score:3)
I want to change the resolution, and I will tell you why: on a 32" monitor, I can't read the text unless it runs at 720p or even 800x600.
Actually, unless you have become totally inured to blocky, pixelated displays what you really want is for everything to be rendered larger.
Fortunately, many operating systems support resolution independence [wikipedia.org], which would allow you to keep your display at its high, native resolution and still draw your widgets, text, etc at a large size. This is done by changing the DPI hint in the OS so that it knows to render things larger (or smaller).
This approach would accomplish the overall effect you desire while avoi
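The DPI figure behind that hint is just geometry; for the 32" case mentioned above (diagonal sizes here are illustrative):

```python
import math

def dpi(width_px, height_px, diagonal_inches):
    """Physical pixel density: diagonal length in pixels divided by
    the diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

# A 32" 1920x1080 panel is only ~69 DPI, so text laid out for the
# traditional ~96 DPI assumption renders physically small; raising
# the DPI hint enlarges it without a mode switch.
# round(dpi(1920, 1080, 32)) -> 69
```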
Re: (Score:3)
Who is still running a CRT? Who wants any program to change the resolution of their screen?
Gamers often do. An average application might run nicely at a high resolution, but for a smooth Skyrim experience, many people may find it necessary to allow it to run at a lower resolution.
Re: (Score:3)
Re: (Score:2)
um.. why would you want a blurry 640x480 stretched game? In addition to looking like shit, it would consume lots of extra GPU resources. Better to run the game at 960x540 (or some other exact divisor) and have X11/the GPU scale it up losslessly.
Re: (Score:2)
Uh, Wayland isn't a compositing window manager, it's a display server protocol. Kind of like X11.
Re:CRT's (Score:5, Interesting)
This is not a CRT-only problem.
Gamers. [hardocp.com]
Not surprising, since you're ignoring the underlying problem. Your 2560x1600 desktop on that 30" LCD is going to kill the ability of your videocard to display a modern game at an acceptable frame rate. Many gamers will not accept windowed half-screen (or whatever fraction is required) gaming on their $1K LCD.
No. Windows and OSX have figured this out. Linux window managers (at least one popular one) need to as well.
Irrelevant to your desired scheme, where keyboard hotkeys would still be required. In Windows and OSX you can still task switch, move to another desktop, etc. using such hotkeys. Yet the game controls the resolution of the monitor in fullscreen mode.
Re:CRT's (Score:4, Insightful)
No, you are missing his point. There is no reason the game could not run at a lower resolution and be scaled by the WM, instead of relying on the screen to do the rescaling. Only CRTs are able to do rescaling physically; LCDs end up doing it in software anyway, and usually in a crappier manner than what the WM could do.
Re: (Score:2)
Unless your game uses OpenGL and you have a fully accelerated driver (read: the proprietary Catalyst or nVidia blob), it will not be able to scale fast enough. Most games use SDL and main memory surfaces that are then blitted to the screen. Any scaling is done in software by the CPU and is dreadfully slow. My Core i7 can handle 1680x1050@60, but just barely, with one core pegged to 100%. The cheapest GPU, of course, can handle that easily, but you must run proprietary drivers and use OpenGL. If you don't, r
Re: (Score:2)
But LCDs scale for free, while CPUs and GPUs do not.
Why reinvent the wheel?
Re: (Score:2)
[[citation needed]]
Re: (Score:2)
Not me, but I'd like to; new, good-quality ones are impossible to find. Anyway, I still use low resolutions for old games, MAME, demos, etc. I still use an old KVM from Y2K that uses VGA, so not changing resolutions and keeping black bars doesn't work. :(
Re: (Score:2)
Re: (Score:2)
some of us have underpowered graphics cards, or, god forbid IGPs, that can't play games at full rez, and need to downscale.
sorry, i'll go dig myself a hole and jump in it now, i'm obviously not worthy.
Re:CRT's (Score:4, Insightful)
> This strikes me as the wrong solution to the problem: .. set its display area to that size
*sigh*
It is sad to see you unfortunately don't know what the hell you are talking about. Let me explain:
There are these precious little things called RAM, Video Memory, Video Buffers, DMA speed, and Scaling.
Games typically use _at least_ *four* buffers:
* Frame Buffer Front (32-bit)
* Frame Buffer Back (32-bit)
* Depth Buffer (24-bit)
* Stencil Buffer (8-bit)
Why should the Window Manager force the app to *over* allocate memory say @ 1920x1080 when the user has selected 640x480??
That is, why would you have the GPU waste a total of 24 MB (8 MB FB Front + 8 MB Back + 6 MB Depth + 2 MB Stencil) compared to ONLY needing 3.6 MB (1200K + 1200K + 900K + 300K) ??
More memory allocated for the buffers means you have LESS resident textures on the GPU.
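The figures above check out: the four buffers listed come to 12 bytes per pixel. A quick sketch of the arithmetic:

```python
def buffer_bytes(w, h, bits_per_pixel):
    """Size of one screen-sized buffer in bytes."""
    return w * h * bits_per_pixel // 8

def frame_memory(w, h):
    """Front + back colour buffers (32-bit each), 24-bit depth,
    8-bit stencil: 12 bytes per pixel in total."""
    return sum(buffer_bytes(w, h, bpp) for bpp in (32, 32, 24, 8))

# frame_memory(1920, 1080) -> 24883200 bytes (~24 MB)
# frame_memory(640, 480)   ->  3686400 bytes (~3.6 MB)
```

Real engines add several more render targets on top of this, so the gap only grows.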
Also, by using a lower resolution you let the monitor do the up-scaling in native *hardware*.
> and then tell the window manager to "fullscreen"
And this is done for "free" in your fantasy world??
Why would you force the GPU to do the *extra* work of a texture-copy up-scale when it doesn't need to do one in the first place, if you are running at a 1:1 resolution at full screen??
> set its display area to that size, and then tell the window manager to "fullscreen" it by removing title bar and border decorations and moving it to (0,0) of that monitor.
That is called "Windowed No Border Mode"
i.e. Gamers want _3_ choices
* Full-Screen (change resolution)
* Windowed (don't change resolution)
* Windowed No Border (don't change resolution)
Lastly SOME games do NOT support arbitrary resolutions. I *want* them to fill my 22" monitor at whatever resolution they DO support. The *fonts* and the rest of the UI elements are BIGGER and easier to see when running in full-screen mode.
Likewise, games that *only* run in full-screen mode are BADLY DESIGNED.
The *proper* thing to do is to give the _user_ choice: Namely the 3 I listed above.
Hope this helps explain some of the issues and WHY this solution is The Right Thing.
Re:CRT's (Score:5, Insightful)
This is a very nice example of what is wrong with Linux. Not the actual problem. The attitude towards the problem and the users who experience it.
Re: (Score:2)
Ever heard of 2880×1800 retina displays? Like to play your games at 60 FPS? Well, as someone rocking one on a 15" machine with a mid-low end GPU, I frequently run into this issue under Linux. And let me tell you, the current system is terrible. Mac OS can do it, Windows can do it; if Linux wants to be competitive, it needs to fix this issue. Just a few days ago, I fired up Tux Racer for a friend to play. I think I had to reboot after that fiasco.
I don't care how it's resolved, different TTY or _NET_WM
Re: (Score:2)
Because many games only come at a fixed resolution. Plus many screens have such an enormous resolution that games are impractical at that scale.
Re: (Score:2)
It's certainly a worthy goal to never need to change the monitor mode. However, I don't think we're quite there yet. Most games that rely on 3D acceleration cannot maintain the maximum frame rate at the maximum resolution supported by the monitor. Therefore, users need to be able to choose resolution to tune the game to their machine and preferences. Once frame rate is truly independent of mode, there should never be a need to reduce resolution.
Re: (Score:2)
in fact generally this approach leads to better visuals than LCD rescalers
Citation needed.
What difference does it make who (the graphics card or the monitor) is doing the scaling??
Re: (Score:2)
It doesn't matter who, but it does matter what algorithm they're using. Monitors aren't guaranteed to be using the algorithm that produces the best results.
Re: (Score:2)
If you're talking about rendered 3D, the video card can always do a better job; it has more information to work with. That's not to say that it will.
Re: (Score:3)
Three big differences come to mind:
Re: (Score:2)
Exactly.
Cheap LCD don't scale properly.
Re:CRT's -- Simple fix. (Score:2, Funny)
Force ALL games to run at 640x480 -- problem solved.
Except for those i386 Linux systems that are trying to run Half-Life 2 .. perhaps we should lower that resolution to 320x240, just to guarantee we're not butting heads with the window manager. After all, the first goal of every Linux game designer should be to ensure the tail log window you're running is properly proportioned at all times.
Re: (Score:3)
Then buy a better video card or run it windowed.
This full screen nonsense is something you flee from Windows to get away from. The idea that it is being dragged back into Linux is just annoying.
It's 2012. It's long past time that Game programmers realized that they don't get to run amok with the system.
It's 2012 and a modern OS, not an Amiga.
Re: (Score:2)
You missed the point. LCDs don't have different resolutions, they have one resolution and only one resolution.
Re: (Score:2)
Re: (Score:2)
Your second reason is stupid. Just because Windows and OSX still sort of do it that way doesn't mean it actually makes sense that you should have to futz with the resolution just to make widgets use more or less screen real estate for better viewing. Window managers should handle scaling of UI elements and text sanely.
If I want bigger text to be easier to read, I still want crisp text. If I want smaller text to have more stuff on the screen, I don't want the letters to all run together like a censor bar.
No assistance required (Score:3, Funny)
Martin Gräßlin of KDE was rather wary about the patch and said that games changing the resolution just tend to mess up the desktop
KDE doesn't need the help.
Then will it be year of the Linux desktop? (Score:2)
So another ten years? Seriously, this is well past due. This is the second story about someone wanting to fix the desktop in the last month or so. Hopefully if there are enough one of them might actually gain traction. Here is hoping. The X system really is a heap. As much as the purists like to bitch about it, thank goodness for nvidia when it comes to multiple monitor support. Too bad it doesn't help the gaming thoug
Re: (Score:3, Insightful)
I'm pretty sure somebody did go in and fix the X11 desktop..... It was Apple w/ OSX.
Re: (Score:2)
I'm pretty sure somebody did go in and fix the X11 desktop..... It was Apple w/ OSX.
Do you mean Display PostScript from NeXT? Apple never used X11 for their primary display (ignoring A/UX and mkLinux for the sake of convenience here), so they had nothing to fix.
Dump X (Score:5, Insightful)
I still think X needs to go. To be truly forward-thinking, it needs to be replaced. Just look at Android. Android would not be useful if it were forced to use X.
Frankly, X does too many things that too few people need. It was designed for a different era and it scales to modern workflows very clumsily. Multi-monitor desktops and gaming on Windows are effortless. On X they're frankly a chore.
Sorry, no, network transparency is not an important feature anymore. It's probably used by 0.001% of regular users. VNC/RDP-style remote access is the way it's done now. And no, nobody cares if it's technically inferior. It's hundreds of times easier to implement and use.
Modern toolkits pretty much ignore 95% of X's built-in features and just pass bitmaps.
Yeah, X has lots of cool things, but you have to realize most of them are impractical or unnecessary. Today we really have the memory, computational power, and bandwidth to bang whatever we want onto the screen without any trouble. The latency and overhead X presents are the enemies today.
Now stop. - Yes you, stop. I know you're about to type up a 10-paragraph screed about how you ported an X app to obscure platform Y, or remotely managed 10,000 servers with facial twitches and GTK. Just stop. Your use case does not represent the vast majority of computer users. It doesn't even represent a full fraction of a percent.
Legacy baggage and clinging to old ideas spawned x.org. The same thing is what will spawn whatever is to replace X.
Re: (Score:3, Insightful)
This is a lot of FUD.
Android. Look at the N9 with an award winning UI. It uses X and is really cool (on outdated hardware).
Network transparency is really useful. VNC/RDP sucks compared to X. And I don't see how it is easier to use than X. Maybe there are more GUIs for it that make it easier for beginners, but that is not for any technical reason.
I don't see what overhead X causes. It worked fine decades ago. Latency is an issue over the network, but only because the toolkits never cared about that. It's not a p
Re: (Score:3)
I agree that we need to come up with a brand new system to handle today's graphics. That's what Wayland is for, and why it's such an interesting project. It is not legacy baggage, but a system designed from the ground up. You have heard of it, haven't you? Seems like every Linux user and their dog knows about it these days.
Also, I'm very glad that Wayland is implementing an X compatibility layer. I'm one of those fraction of a percent that use and enjoy network transparency. It would annoy the hell out
Re: (Score:2, Informative)
Wayland has an X compatibility layer, sure, but you may also be pleased to know that there are efforts underway to get native Wayland network transparency.
See, the cool thing about Wayland is that the protocol is written to be flexible enough for some other program to handle the network-transparency part, seamlessly. It's not part of the core design of Wayland simply because it doesn't have to be in the core at all.
An added bonus of this flexibility is the ability to do network-transparency things that even X c
Re: (Score:2)
That ... is awesome news! Thanks for that information. I hadn't heard anything about it.
The X compatibility layer will still be useful regardless, but I'm very glad to hear that Wayland will likely have network transparency as well. The more I hear about this system, the more I'm liking it.
Re: (Score:2)
The feature I was referring to was demo'd here [youtu.be], where the presenter forwards a window from one display to the other, ending up with the same window on two displays. This is local in the demo, but he says that it's transferring graphics data over the network, and I have no reason not to believe him. This presentation is from last month, almost a year after the blog post you linked.
Also, X can move windows from one screen to another without a problem. Most toolkits don't support it, but there have been extensions/patches around for a while.
I was not aware that this was a feature of X, but I would like to point out that this is not an *extension* of Wayland doing this,
Re: (Score:3)
If you ever want to see the 'year of the Linux desktop', we need to ditch this technically-superior but useless to most mentality and just do something that *works*.
If you ditch the technically-superior bit in order to play games more easily, you'll have a cheap copy of Windows on the desktop, not Linux.
If that's what you want, it's already available at piratebay. (Or what the heck, even a legally obtained copy of Windows is "free" for most people.)
Re: (Score:3)
Wayland has an X11 compatibility mode. So in what sense is that a "go to hell message"?
Have you ever used an "X11 compatibility layer" on, e.g., OS X or Windows?
They suck, because the integration between X and non-X windows sucks. They suck because you can't use an X11 window manager to manage native windows. They suck because native windows can't be remoted using X11.
Basically it makes X11 programs bastard red-headed stepchildren, and it doesn't work nearly as well as using a single system.
The two modern versions of *nix,
deeply technical (Score:2)
Re: (Score:3)
The problem is that there are no semantics in X that allow a program to change the screen resolution while NOT causing those other programs to do stuff.
This new flag is to signal these semantics. "Hey, we are changing the resolution, but we have this new idea called Exclusive Control over the display, so nobody nee
Re: (Score:2)
Re: (Score:2)
I dunno. If a game is running amok because gamers and game programmers suffer from an 80s mentality that a computer is a game console, then perhaps you don't want the rest of the GUI acknowledging this foolishness.
The fact that games on Linux don't scramble my desktop like they do under Windows IS ACTUALLY A GOOD THING.
Even with the status quo, cleaning up after a game run amok is less bothersome under Linux.
Make it a library! (Score:2)
Easy Fix (Score:2)
When a game starts, it wants the entire desktop; it doesn't want the other desktop elements at all: no dock, no icons, no interaction, etc.
Why isn't there a function to create a new virtual desktop at any resolution you want, leaving the original desktop untouched? Make the resolution a property of the desktop, so that when you switch between desktops the system knows what mode to switch to as well.
Seems like an easy fix.
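As a toy model of that idea, here's a short Python sketch. Every name in it is invented for illustration; nothing like this API exists in X or in any current window manager:

```python
# Toy model: each virtual desktop carries its own resolution tag, and
# switching desktops switches the display mode with it. The "work"
# desktop never sees the mode change the "game" desktop asked for.

class Desktop:
    def __init__(self, name, resolution):
        self.name = name
        self.resolution = resolution  # (width, height)

class Screen:
    def __init__(self, native_mode):
        self.mode = native_mode
        self.desktops = {}
        self.current = None

    def add_desktop(self, desktop):
        self.desktops[desktop.name] = desktop

    def switch_to(self, name):
        # The mode change is a side effect of the desktop switch,
        # so no per-desktop state (icon layout, widgets) is disturbed.
        self.current = self.desktops[name]
        self.mode = self.current.resolution
        return self.mode

screen = Screen(native_mode=(2560, 1440))
screen.add_desktop(Desktop("work", (2560, 1440)))
screen.add_desktop(Desktop("game", (1024, 768)))  # the game's own "screen"
print(screen.switch_to("game"))   # (1024, 768)
print(screen.switch_to("work"))   # (2560, 1440)
```

In effect this is the same intent as the proposed hint, just expressed as "give the game its own desktop" rather than "tell the WM the fullscreen window is exclusive."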
Re: (Score:2)
What else are you doing on your computer while you play that game?
Working, posting snarky comments on Slashdot, etc.
Re: (Score:2)
I'm replying to the "game wants the entire desktop so it takes it" comment. If I want to expand an app to the full screen, fine. That's my choice. But if I don't want to, then the developers should damned well figure out how to write 'well behaved' apps.
Sam Lantinga (from Loki Games) (Score:5, Informative)
I don't know if kids today remember, but Loki Games was one of the first commercial plays for big name games on Linux. Ended in tragic business troubles and financial doom.
It warms my heart to see that Sam Lantinga is still working on SDL.
That is all.
Re: (Score:3, Informative)
Re: (Score:2)
I have had the feed of the SDL Mercurial changelog [libsdl.org] on watch for a good while, from back when I felt I could make a game with SDL within a reasonably short time.
Times changed, assorted Shit Happened (both within and without my PC), and my SDL tinkery and SVG tinkery became Blender tinkery (short YouTube video of mine) [youtube.com] became "fuck this I'll just play some Torchlight and roughly build a witch there instead", but I still had the feed on watch and saw a relevant change [libsdl.org]. The summary had a different tone from th
Run a dedicated X-server (Score:4, Interesting)
Not sure what 'mess' the title refers to, but I sidestepped the issues I had with Baldur's Gate by running it in its own X server on a separate VT.
As I recall it just took a simple script to start the server and the game, and one extra command to make the mouse cursor less fugly. My main desktop remained completely undisturbed and just an Alt-F7 away. A little polish and this approach could be a good general solution, no?
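For the curious, that trick boils down to handing the game to xinit as the sole client of a freshly started server. Here's a small Python sketch that only *builds* the command line rather than running it (the game path is a placeholder, and the display/VT numbers are just common defaults; actually starting a second server needs appropriate permissions):

```python
# Build the xinit invocation for running a game on its own X server.
# xinit starts a new server on the given display/VT, runs the client
# (the game) against it, and tears the server down when the client exits,
# leaving the main desktop on vt7 untouched.
import shlex

def xinit_command(game_path, display=":1", vt="vt8"):
    # Everything after "--" is passed to the X server itself.
    return ["xinit", game_path, "--", display, vt]

cmd = xinit_command("/usr/local/games/bg/baldurs_gate")
print(shlex.join(cmd))  # xinit /usr/local/games/bg/baldurs_gate -- :1 vt8
```

From there it's the usual VT switching: the desktop stays an Alt-F7 away, exactly as described above.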
A Clue! (Score:3)
It's stories like these... (Score:2)
...that make me realize I may not really belong here.
Re: (Score:2)
Switch to OS X or Windows and dump Linsux already.
Actually, I keep a Windows box to use for gaming. Linux works just fine for everything else.
(And would probably work just fine for gaming, if anyone would bother making games for it.)
Re: (Score:2)
>Linsux
Anyone who says this or "loonix" or a variation on this is just as bad as someone who spells Microsoft as Micro$oft and variations thereof.
--
BMO
Disclaimer: I had that habit but then I grew up.
Re: (Score:3)
We can all agree that X is a gigantic mess. It badly needs to be replaced by something better.
Maybe instead of everyone jumping in and telling us how bad X is, someone could take a minute to explain what's wrong with it for us non-technical types.
Re:Trying to solve the wrong problem (Score:5, Informative)
X is a very old protocol, with a lot of things that need to be implemented in order for something to say that it "speaks X". Things like font rendering and 2d drawing routines. Things that nobody actually uses anymore.
X used to be in charge of providing font rendering and such, but now libraries like Freetype and Cairo do it instead (and do it better). X used to be in charge of accelerated video, but now we have good OpenGL support and Kernel Mode Setting. The only thing X really does now is act as a proxy between window managers and applications. But X still has support for all the old stuff, and so it's huge and lumbering.
Wayland is a new protocol designed to be used between window managers and applications directly. In a new breed of window managers, the window manager itself will set up the screen and draw to it using kernel mode setting and opengl, and it will communicate and delegate windows to applications by talking the Wayland protocol. So there's nothing like the X server sitting between them anymore: the window manager runs directly on the console, and talks directly to applications.
This might sound like it would cause a lot of duplicated effort, and in one sense that's true. However, the amount of boilerplate code needed to set up a simple Wayland-speaking window manager is about the same as the amount needed to set up a simple X11 window manager. The difference is that in the Wayland case there's no huge X server sitting between applications and the screen, because Wayland is simple enough for window managers to speak it directly.
One side effect of this is that the Wayland library API is *much* simpler to use than the X libraries, so it becomes a lot easier to write new experimental window managers. I expect we'll see a lot of new WMs after Wayland becomes standard. Another plus is that Wayland has built-in support for multiple pointers and arbitrary window transformations, which is an extremely nice touch for multi-touch screens.
Re: (Score:3)
These two files, window.c [freedesktop.org] and image.c [freedesktop.org], are an entire simple GUI toolkit and an example program using that toolkit, for use with a Wayland compositor.
This directory [freedesktop.org] is an entire compositing window manager that speaks the Wayland protocol. This is already impressively small, but keep in mind that most of the complexity here is in actually drawing to the screen and getting input events from hardware, something that Wayland has nothing to do with, and it's *still* small.
Wayland is simple because it is small, a
Re: (Score:3)
The "network transparency" objection is a red herring, and it's getting rather tiresome. We're not "losing" network transparency. First, we don't have network transparency now; when nearly every application depends on Xshm and direct rendering for anything resembling reasonable display performance, the fact that you can draw obsolete primitives on the server through X11 core rendering protocol requests is hardly relevant. Remote X11 apps have already been reduced to rendering their windows locally and sendi
Re: (Score:2)
When Wayland supports !linux, it can be considered.
Re: (Score:2)
Full screen presentations aren't needed
There are some users here that have a pathological hatred for maximized windows -- just hearing about full screen, to them, is like getting raped by Satan.
These guys don't believe you can do anything productive without at least 7 36" monitors, with 15 or so windows arranged in each.
I have little doubt that they'd try to give a presentation on their dedicated presentation monitor (one of those old, microscopic, 21" LCDs) in a window that takes up about 25% of the display.
Be careful. They tend to bite if the