A Proposal To Fix the Full-Screen X11 Window Mess

jones_supa writes "The SDL developers Ryan Gordon and Sam Lantinga have proposed a window manager change to sort out the full-screen X11 window mess, primarily for games. The proposal is to introduce a _NET_WM_STATE_FULLSCREEN_EXCLUSIVE window manager hint that addresses the shortcomings of _NET_WM_STATE_FULLSCREEN, the full-screen hint currently used by most games. Ryan and Sam have already worked out an initial patch for SDL, but they haven't tried hooking it up to any window manager yet. For those interested in the details, information is available in this mailing list message. One of the key changes is that software would ask the window manager to change the resolution, rather than tapping RandR or XVidMode directly. Martin Gräßlin of KDE was rather wary about the patch and said that games changing the resolution just tend to mess up the desktop." Seems like a reasonable idea, given a bit of time to mature as a spec. In KDE's case, a separate daemon from the window manager handles resolution changes, so going through the WM would add complexity, and the Plasma shell still has no way to realize that it shouldn't reflow the desktop widgets. Setting window properties does seem like a sensible IPC method for communicating intent, though (without making yet another aspect of the X desktop reliant upon the not-very-network-transparent D-Bus): "hey, I need to resize, but just for me, so don't reshuffle the desktop and docks."
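
The proposal builds on the standard EWMH mechanism for state changes: a client sends a ClientMessage to the root window asking the window manager to toggle a _NET_WM_STATE entry. A minimal sketch of what requesting the proposed hint could look like follows; the atom name is taken from the proposal, while the helper function itself and any window-manager support for it are assumptions at this point.

    /* Sketch only: ask the WM for the proposed exclusive-fullscreen state
     * using the usual EWMH ClientMessage protocol. Error handling omitted. */
    #include <X11/Xlib.h>
    #include <string.h>

    #define _NET_WM_STATE_ADD 1

    static void request_exclusive_fullscreen(Display *dpy, Window win)
    {
        Atom wm_state  = XInternAtom(dpy, "_NET_WM_STATE", False);
        Atom exclusive = XInternAtom(dpy, "_NET_WM_STATE_FULLSCREEN_EXCLUSIVE", False);

        XEvent ev;
        memset(&ev, 0, sizeof(ev));
        ev.xclient.type         = ClientMessage;
        ev.xclient.window       = win;
        ev.xclient.message_type = wm_state;
        ev.xclient.format       = 32;
        ev.xclient.data.l[0]    = _NET_WM_STATE_ADD;  /* add the state */
        ev.xclient.data.l[1]    = (long)exclusive;    /* property to toggle */
        ev.xclient.data.l[2]    = 0;                  /* no second property */
        ev.xclient.data.l[3]    = 1;                  /* source: normal application */

        /* EWMH state changes go through the root window, not a direct
         * property write on the client window. */
        XSendEvent(dpy, DefaultRootWindow(dpy), False,
                   SubstructureRedirectMask | SubstructureNotifyMask, &ev);
        XFlush(dpy);
    }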

  • Music to my ears! (Score:5, Interesting)

    by DaneM ( 810927 ) on Thursday October 25, 2012 @08:33PM (#41772661)

    With Linux finally becoming a more "proper" gaming platform (i.e. Steam and others), it's "about time" that this is dealt with. _NET_WM_STATE_FULLSCREEN_EXCLUSIVE, where have you been my whole adult life? Gotta hand it to Ryan Gordon ("Icculus," as I recall) for consistently making Linux gaming that much more viable.

  • by brion ( 1316 ) on Thursday October 25, 2012 @08:53PM (#41772809) Homepage
    This is exactly how some games work on Mac OS X, for instance Source-based games like Portal and Half-Life 2. They don't muck with the actual screen resolution; they just render into an offscreen buffer at whatever resolution and blit it stretched to the full screen. Switching from the game back to other apps doesn't disturb the desktop in any way. Would definitely love to see more Linux games using this technique.
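
A rough illustration of that technique, assuming OpenGL 3.0-style framebuffer objects: the game renders into an offscreen FBO at its chosen resolution and then stretch-blits the result to the window's default framebuffer, so the desktop mode is never touched. The function and parameter names here are placeholders, not anything from SDL or the games mentioned.

    /* Sketch: present an offscreen render target stretched to the window.
     * Assumes a GL 3.0+ context and that scene_fbo already holds the frame. */
    #define GL_GLEXT_PROTOTYPES   /* expose FBO entry points on Mesa/Linux */
    #include <GL/gl.h>
    #include <GL/glext.h>

    void present_scaled(GLuint scene_fbo, int game_w, int game_h,
                        int window_w, int window_h)
    {
        /* Read from the offscreen buffer, draw to the default framebuffer. */
        glBindFramebuffer(GL_READ_FRAMEBUFFER, scene_fbo);
        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);

        /* Stretch the game-resolution image over the whole window;
         * linear filtering smooths the scaling. */
        glBlitFramebuffer(0, 0, game_w, game_h,
                          0, 0, window_w, window_h,
                          GL_COLOR_BUFFER_BIT, GL_LINEAR);
    }
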
  • Re:CRT's (Score:5, Interesting)

    by DRJlaw ( 946416 ) on Thursday October 25, 2012 @08:55PM (#41772835)

    Who is still running a CRT?

    This is not a CRT-only problem.

    Who wants any program to change the resolution of their screen?

    Gamers. [hardocp.com]

    This strikes me as the wrong solution to the problem:

    Not surprising, since you're ignoring the underlying problem. Your 2560x1600 desktop on that 30" LCD is going to kill the ability of your videocard to display a modern game at an acceptable frame rate. Many gamers will not accept windowed half-screen (or whatever fraction is required) gaming on their $1K LCD.

    A program should instead request the "current monitor's resolution" (because there can be more than one!), set its display area to that size, and then tell the window manager to "fullscreen" it by removing the title bar and border decorations and moving it to (0,0) of that monitor. But NEVER EVER RESIZE MY MONITORS.

    No. Windows and OSX have figured this out. Linux window managers (at least one popular one) need to as well.

    The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.

    Irrelevant to your desired scheme, where keyboard hotkeys would still be required. In Windows and OSX you can still task switch, move to another desktop, etc. using such hotkeys. Yet the game controls the resolution of the monitor in fullscreen mode.
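
Whichever side of that argument one takes, the borderless "window at desktop resolution" approach quoted in the comment above is straightforward to sketch with plain Xlib plus XRandR: query the monitor's current geometry, strip decorations via the old Motif hints property, and resize the window to cover that monitor without ever changing the video mode. This is only an illustration; picking the right monitor is simplified to the first CRTC and error handling is omitted.

    /* Sketch of borderless "fake" fullscreen at the desktop resolution. */
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>
    #include <string.h>

    static void borderless_fullscreen(Display *dpy, Window win)
    {
        /* 1. Current geometry of the first CRTC (simplification: a real
         *    game would pick the monitor the window is actually on). */
        XRRScreenResources *res =
            XRRGetScreenResourcesCurrent(dpy, DefaultRootWindow(dpy));
        XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, res->crtcs[0]);

        /* 2. Ask the WM to drop the title bar and borders (Motif hints). */
        struct {
            unsigned long flags, functions, decorations;
            long input_mode;
            unsigned long status;
        } hints;
        memset(&hints, 0, sizeof(hints));
        hints.flags = 2;        /* MWM_HINTS_DECORATIONS */
        hints.decorations = 0;  /* no decorations */
        Atom motif = XInternAtom(dpy, "_MOTIF_WM_HINTS", False);
        XChangeProperty(dpy, win, motif, motif, 32, PropModeReplace,
                        (unsigned char *)&hints, 5);

        /* 3. Cover the monitor; the desktop resolution is left alone. */
        XMoveResizeWindow(dpy, win, crtc->x, crtc->y,
                          crtc->width, crtc->height);

        XRRFreeCrtcInfo(crtc);
        XRRFreeScreenResources(res);
        XFlush(dpy);
    }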

  • by smugfunt ( 8972 ) on Thursday October 25, 2012 @09:53PM (#41773243)

    Not sure what 'mess' the title refers to, but I sidestepped the issues I had with Baldur's Gate by running it in its own X server on a separate VT.
    As I recall, it just took a simple script to start the server and the game, and one extra command to make the mouse cursor less fugly. My main desktop remained completely undisturbed and just an Alt-F7 away. A little polish and this approach could be a good general solution, no?
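
Roughly speaking, the "simple script" described above boils down to handing the game to xinit as the sole client of a second X server on another display and VT. A toy launcher doing the same thing might look like the sketch below; the game path, display number, and VT are made-up placeholders, and as noted further down the thread, starting an X server this way can require extra privileges.

    /* Toy sketch: run the game in its own X server on display :1, VT 8. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* Equivalent to running: xinit /usr/local/bin/mygame -- :1 vt8 */
        execlp("xinit", "xinit", "/usr/local/bin/mygame", "--", ":1", "vt8",
               (char *)NULL);
        perror("execlp(xinit) failed");  /* reached only if exec fails */
        return 1;
    }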

  • Re:Hilarious excuses (Score:5, Interesting)

    by dgatwood ( 11270 ) on Thursday October 25, 2012 @10:01PM (#41773283) Homepage Journal

    Why don't games just spawn a separate X11 window server instance with a different resolution on a separate VC? Adding proper resource sharing between X11 instances seems like it would be a lot easier to do than rearchitecting all the existing apps to do the right thing during a temporary resolution change.

    And there's no benefit to a full-screen app running in the same X11 instance as any other app, other than making it possible to transition a window between normal and full-screen states, and with a resolution change that won't work very well anyway, which makes even that argument mostly moot.

  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Friday October 26, 2012 @12:01AM (#41773911) Homepage

    Exceedingly little, though.

    Modern games already render to more than one off-screen buffer (necessitated by HDR, deferred shading, and other fun things), only blitting and gamma-correcting the final bits to the screen's framebuffer at the very end.

    The tiny amount of RAM occupied by the 8-bit framebuffer for a large screen resolution is dwarfed by these several off-screen buffers, some of which will use 16-bit components.

    The amount of GPU work needed to draw a single full-screen quad really is too trivial to care about.
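
A back-of-the-envelope calculation illustrates the memory point: a final 8-bit RGBA framebuffer at a large desktop resolution costs far less than a handful of 16-bit-per-component offscreen targets at a typical render resolution. The resolutions and buffer counts below are illustrative examples, not figures from any particular game.

    /* Illustrative memory comparison; the numbers are examples only. */
    #include <stdio.h>

    int main(void)
    {
        const double MiB = 1024.0 * 1024.0;

        /* Final 8-bit RGBA framebuffer at a 2560x1600 desktop. */
        double final_fb = 2560.0 * 1600 * 4 / MiB;      /* ~15.6 MiB */

        /* Say four RGBA16F deferred-shading targets at 1600x900. */
        double gbuffers = 4 * 1600.0 * 900 * 8 / MiB;   /* ~43.9 MiB */

        printf("final framebuffer: %.1f MiB\n", final_fb);
        printf("offscreen targets: %.1f MiB\n", gbuffers);
        return 0;
    }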

  • by tepples ( 727027 ) <tepples.gmail@com> on Friday October 26, 2012 @01:58AM (#41774479) Homepage Journal

    For the myriad responses that brought up this point: the answer is video card hardware scaling.

    And this is exactly the solution that the Xbox 360 uses. A lot of newer games are fill-rate limited: because of the complexity of the pixel shaders that games use, the AMD Xenos GPU in the Xbox 360 (similar to a Radeon X1800) can't run them at an acceptable frame rate at any resolution over 1024x576. So games use the Xenos's scaler to turn 1024x576 into 1280x720 or 1920x1080 pixels for the component or HDMI output.

  • Re:Hilarious excuses (Score:2, Interesting)

    by Anonymous Coward on Friday October 26, 2012 @03:56AM (#41774893)

    > Why don't games just spawn a separate X11 window server instance
    Because the X server may require root privilege, and making something as complex as an X server setuid root is a bad idea, the only safe and reliable way to start an X server is via a root-owned daemon (i.e. a display manager).
    Also, a bare X server isn't always useful: you may need other parts of the desktop environment, e.g. to configure the keyboard (particularly if it isn't a US layout).
    An "exclusive full-screen" WM property is conceptually the right way to go. The push-back from the KDE guy is mostly due to KDE being unable to resize the screen without resizing the desktop. X itself doesn't have a problem with windows being either larger than the physical screen or larger than their parent.

  • by JDG1980 ( 2438906 ) on Friday October 26, 2012 @11:46AM (#41778981)

    Rather than trying to cram modern features into the creaky old pile of bloat that is X11, we should relegate that ancient technology to server-only use and replace it with something modern that can talk more closely to the hardware without going through half a dozen abstraction layers.

    Yes, network transparency, I know. 99% of users don't give a shit, and just want things to display and animate smoothly – which X11 fails miserably at. Keep that on servers, where it matters, and drop it on desktops, where it doesn't. There is a reason why, when Google borrowed parts of Linux to make Android, they dropped the X11 layer without a second thought.
