
A Proposal To Fix the Full-Screen X11 Window Mess 358

jones_supa writes "SDL developers Ryan Gordon and Sam Lantinga have proposed a window-manager change to sort out the full-screen X11 window mess, primarily for games. The proposal is a new _NET_WM_STATE_FULLSCREEN_EXCLUSIVE window-manager hint that addresses the shortcomings of _NET_WM_STATE_FULLSCREEN, the full-screen hint most games use today. Ryan and Sam have already worked out an initial patch for SDL, but they haven't tried hooking it into any window manager yet. For those interested, the details are available in this mailing list message. One of the key changes is that software would ask the window manager to change the resolution, rather than tapping RandR or XVidMode directly. Martin Gräßlin of KDE was rather wary of the patch, saying that games changing the resolution just tend to mess up the desktop." Seems like a reasonable idea, given a bit of time to mature as a spec. In KDE's case, a daemon separate from the window manager handles resolution changes, so going through the WM would add complexity, and the Plasma shell still has no way to realize that it shouldn't reflow the desktop widgets. Still, setting window properties seems like a sensible IPC method for communicating intent (without making yet another aspect of the X desktop reliant on the not-very-network-transparent D-Bus): "hey, I need to resize, but just for me, so don't reshuffle the desktop and docks."
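For the curious, the existing mechanism the proposal builds on is the EWMH _NET_WM_STATE protocol: a client asks the window manager for a state change by sending a ClientMessage event to the root window. A rough pure-Python sketch of that message's layout follows; the atom numbers and window id below are hypothetical placeholders, since a real client interns the atom names with XInternAtom and delivers the event with XSendEvent.

```python
# Sketch of the 32-bit data words in an EWMH _NET_WM_STATE ClientMessage.
# Atom ids here are made-up placeholders; real clients obtain them from
# XInternAtom and send the event to the root window via XSendEvent.

_NET_WM_STATE_REMOVE = 0
_NET_WM_STATE_ADD = 1
_NET_WM_STATE_TOGGLE = 2

def net_wm_state_message(window, action, prop1, prop2=0, source=1):
    """Build the fields of a _NET_WM_STATE ClientMessage event."""
    return {
        "window": window,                  # the client window requesting the change
        "message_type": "_NET_WM_STATE",   # really an interned atom
        "format": 32,
        # data.l[0]=action, l[1]/l[2]=property atoms, l[3]=source indication
        "data": [action, prop1, prop2, source, 0],
    }

# Hypothetical atom id standing in for the proposed fullscreen-exclusive hint.
FULLSCREEN_EXCLUSIVE = 0x1A1

msg = net_wm_state_message(0x2C00007, _NET_WM_STATE_ADD, FULLSCREEN_EXCLUSIVE)
print(msg["data"])  # [1, 417, 0, 1, 0]
```

The proposed _NET_WM_STATE_FULLSCREEN_EXCLUSIVE hint would ride on this same mechanism, just with a new property atom.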
This discussion has been archived. No new comments can be posted.

  • Hilarious excuses (Score:5, Insightful)

    by dnaumov ( 453672 ) on Thursday October 25, 2012 @08:23PM (#41772577)

    Martin Gräßlin of KDE was rather wary about the patch and said that games changing the resolution just tend to mess up the desktop.

    So, ugh... fix your desktop?

    • Agreed. Why can't the plasma widgets just save their positions and change back when the resolution changes back?

      • by Kjella ( 173770 ) on Thursday October 25, 2012 @09:30PM (#41773085) Homepage

        The desktop doesn't know what caused the change, so you could run into a lot of strange issues. Imagine you lay out your desktop on a 30" 2560x1440 monitor, then switch to a 1920x1080 monitor and add, remove, or move an icon. What happens when you reattach the first monitor? Should everything just "snap back" to where it was, even if you've arranged your icons completely differently now? To me the solution outlined here seems much smarter: just let the game have its own "screen" with its own settings, no need to even tell the other windows about it.

        • Re:Hilarious excuses (Score:4, Informative)

          by Anonymous Coward on Thursday October 25, 2012 @09:50PM (#41773215)

          This is the exact purpose of this proposal: to create a new signal telling the window manager that the change is temporary and only takes effect while a specific window has focus. That way the window manager knows there's no need to move the icons in the first place.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Indeed, he didn't even realize that this flag wouldn't tell the widgets the resolution changed, so they would never be rearranged in the first place. I doubt he has even read the proposed spec.

    • Except that we're no longer in the era of CRTs. Since LCDs have one native resolution, they should always be driven at that resolution. If a game wants to run at 640x480, then that should be accomplished by scaling it up and adding black bars, if necessary, but the signal to the LCD should still be at the original resolution.
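The parent's "scale up and add black bars" is just an aspect-preserving fit; a quick sketch of the arithmetic (illustrative only, not tied to any particular API):

```python
def letterbox(src_w, src_h, dst_w, dst_h):
    """Fit a src_w x src_h image into a dst_w x dst_h panel, preserving
    aspect ratio; returns (x, y, w, h) of the scaled image. The rest of
    the panel is filled with black bars."""
    scale = min(dst_w / src_w, dst_h / src_h)
    w, h = round(src_w * scale), round(src_h * scale)
    return ((dst_w - w) // 2, (dst_h - h) // 2, w, h)

# A 640x480 game on a 1920x1080 panel: scaled to 1440x1080 with
# 240-pixel black bars on each side, while the LCD stays at native res.
print(letterbox(640, 480, 1920, 1080))  # (240, 0, 1440, 1080)
```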

      • by MBCook ( 132727 )
        If you don't trust your LCD to do it (I don't blame you; some LCDs are better at scaling than others), that sounds like something that should be done automatically and transparently by the video driver, not something the WM should have to manage.
        • by Lord Byron II ( 671689 ) on Thursday October 25, 2012 @11:00PM (#41773631)

          It's not that I don't trust the LCD. It's that when you change the resolution, you tend to screw other things up as well.

          I have three monitors and I game on the center one. I like to keep my email and IRC open on the other ones while I play. But if the game adjusts the resolution, the positions of the other windows move around and I can no longer see all of them. This happens in Windows if the game doesn't run at the same resolution as my center monitor.

      • by epyT-R ( 613989 )

        Or even multiples/divisors of that resolution, ideally exposed via EDID. Since they're even multiples, the screen can do a simple, lossless point-sample scale, which is computationally cheap (compared to the common bilinear) and lets these low resolutions run full-screen with no added latency (the scaler chips inside most panels are slow). These are needed because desktops might be 2560x1600, but most GPUs won't run games well at that resolution.

        Nvidia's Windows drivers support scaling in the GPU too, but unfortunately
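The "even multiples" point above is what makes point sampling lossless: every source pixel maps to an exact block of destination pixels, so no filtering is needed. A toy sketch of that integer upscale:

```python
def point_sample_upscale(pixels, w, h, factor):
    """Nearest-neighbour upscale of a row-major image by an integer
    factor: each source pixel becomes a factor x factor block, with no
    filtering and therefore no blur."""
    out = []
    for y in range(h * factor):
        src_row = pixels[(y // factor) * w:(y // factor + 1) * w]
        out.extend(p for p in src_row for _ in range(factor))
    return out

img = [1, 2,
       3, 4]  # a 2x2 "image"
print(point_sample_upscale(img, 2, 2, 2))
# [1, 1, 2, 2, 1, 1, 2, 2, 3, 3, 4, 4, 3, 3, 4, 4]
```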

  • by Anonymous Coward

    Just start the goddamn games on a totally different TTY. There, problem solved!

    • Re: (Score:2, Redundant)

      by Waffle Iron ( 339739 )

      Just start the goddamn games on a totally different TTY. There, problem solved!

      That's what I do to play games.

      I usually just switch over to TTY1. Then I can load TREK73.BAS:

      *

      * *
      *
      -E-
      *

      quadrant 3/1
      condition GREEN
      torpedoes 9
      energy 1434
      shields 1000
      klingons 14

      command:

  • Music to my ears! (Score:5, Interesting)

    by DaneM ( 810927 ) on Thursday October 25, 2012 @08:33PM (#41772661)

    With Linux finally becoming a more "proper" gaming platform (i.e. Steam and others), it's "about time" that this is dealt with. _NET_WM_STATE_FULLSCREEN_EXCLUSIVE, where have you been my whole adult life? Gotta hand it to Ryan Gordon ("Icculus," as I recall) for consistently making Linux gaming that much more viable.

    • I didn't see any information in the article, but what exactly is the problem with X11 full screen support? I don't game in Linux, and this is the first time I've even heard of this.

      • Re:Music to my ears! (Score:5, Informative)

        by DaneM ( 810927 ) on Thursday October 25, 2012 @10:57PM (#41773601)

        I didn't see any information in the article, but what exactly is the problem with X11 full screen support? I don't game in Linux, and this is the first time I've even heard of this.

        The biggest issue is that when the game goes full-screen, it changes the resolution to whatever the game is set to, which may or may not be what you keep your desktop at. Then, when you exit the game, the icons are usually huge, the taskbars are usually all messed up (even when "locked"!), and you have to futz around to make it usable again. Also, many games on Linux won't even let you Alt-Tab to other windows: either nothing happens, or the resolution won't be correct, or the game crashes. It's really unpleasant to deal with. It's also worth noting that many games (especially Linux games, sadly) are extremely limited in what resolutions they'll offer, so even if you want to set the game to your native resolution, it might not work or even let you try.

      • Re:Music to my ears! (Score:5, Informative)

        by TheRaven64 ( 641858 ) on Friday October 26, 2012 @05:16AM (#41775257) Journal

        The article contained a lot of detail. The current mechanism is a two-step thing where the application first requests full-screen control from the WM. The WM then resizes the window to fit the current screen (which may not make the game happy), removes decorations, and then gets out of the way. Then the game changes the resolution and resizes the window again. The resolution change notification is delivered to the WM, which then propagates it to all of the applications, so if you want to play a fullscreen game at 640x480 then you may find that all of your application windows have resized themselves to fit in this screen when you exit it. The game then runs (hopefully) and if it crashes then in theory the WM is responsible for restoring the video mode, but in practice it doesn't know for certain that the game was responsible for changing it, so it may not.

        With the new proposal, the game resizes its window to the resolution it wants and then requests full screen mode. The WM should then change the screen resolution to the closest to the current window size and doesn't propagate the resolution change notification to any other applications. This means that nothing else gets resized. If the game crashes, or if you do any kind of switching out of the game, then the desktop resolution will be restored.

        And, while it's fashionable to hate X11, it's worth noting that Windows and OS X both suffer from some of the problems that this proposal is intended to fix.
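The "closest to the current window size" step described above could be a simple nearest-match over the modes RandR reports. A sketch, with Manhattan distance as an arbitrary closeness metric (the proposal doesn't specify one):

```python
def closest_mode(modes, req_w, req_h):
    """Pick the available video mode closest to the requested window
    size, as a WM honoring a fullscreen-exclusive request might."""
    return min(modes, key=lambda m: abs(m[0] - req_w) + abs(m[1] - req_h))

modes = [(640, 480), (800, 600), (1024, 768), (1920, 1080)]
print(closest_mode(modes, 640, 480))  # (640, 480): exact match
print(closest_mode(modes, 820, 600))  # (800, 600): nearest available
```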

  • CRT's (Score:4, Insightful)

    by mcelrath ( 8027 ) on Thursday October 25, 2012 @08:34PM (#41772669) Homepage

    Who is still running a CRT? Who wants any program to change the resolution of their screen?

    This strikes me as the wrong solution to the problem: A program should instead request the "current monitor's resolution" (because there can be more than one!) set its display area to that size, and then tell the window manager to "fullscreen" it by removing title bar and border decorations and moving it to (0,0) of that monitor. But NEVER EVER RESIZE MY MONITORS. Thank you. The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.

    • I agree; I have no idea why game windows aren't handled better.
      It is basically impossible to run many, quite possibly most, games in a window, and even the ones that do allow it often require editing files or hacking the exe.
      In theory the OS is being sent all this visual data anyway, so no matter how the game was programmed you should be able to resize it or run it in a window.

      • by Arker ( 91948 )

        Yes, theoretically, but in reality resizing stuff on the fly, particularly to odd, one-off resolutions to fit in a window, is a big performance sink. That's fine for some games, but there is a good chunk of the market where it isn't acceptable. Plus, for many games, moving the mouse to the edge of the screen actually has a specific meaning. It's not always straightforward to determine whether you mean to throw the mouse against the edge of the screen in-game or just mean to move the mouse out of the window to che

    • Re:CRT's (Score:5, Insightful)

      by EvanED ( 569694 ) <{evaned} {at} {gmail.com}> on Thursday October 25, 2012 @08:45PM (#41772737)

      Who wants any program to change the resolution of their screen?

      Someone whose graphics card isn't up to the task of running a game at full native resolution? That'd be my guess anyway; I haven't willingly used a lower resolution for a while. (Some games don't support high resolutions, or don't support widescreen resolutions, and there it's "reasonable" that they change it as well. But a program like that probably wouldn't use that in the first place, so whatever.)

      The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.

      I don't know enough about this proposal to say how it interacts with this (indeed, I'm rather disappointed by both the summary and TFA not actually, you know, saying what the problems are in the first place), but there's absolutely no reason why those goals are in conflict. In fact, the proposal specifically addresses this: "If the window loses input focus while fullscreen, the Window Manager MUST revert the resolution change and iconify the window until it regains input focus. The Window Manager MUST protect desktop state (icon positions, geometry of other windows, etc) during resolution change, so that the state will be unchanged when the window ceases to be marked as fullscreen."

      • Re:CRT's (Score:4, Informative)

        by mcelrath ( 8027 ) on Thursday October 25, 2012 @10:02PM (#41773287) Homepage

        Someone whose graphics card isn't up to the task of running a game at full native resolution?

        For the myriad of responses that brought up this point: the answer is video card hardware scaling. E.g. add a flag _NET_WM_STATE_SCALING_ALLOWED which directs the WM to use hardware scaling from a fixed-size framebuffer, as is done by video players. Not only can you make it full screen, but you can resize it to any arbitrary shape and size (e.g. don't cover your widget bar, etc). Then the Window Manager decides what is "fullscreen". It could even make an app span more than one monitor when "fullscreen", or just one.

        • by tepples ( 727027 ) <tepples.gmail@com> on Friday October 26, 2012 @01:58AM (#41774479) Homepage Journal

          For the myriad of responses that brought up this point: the answer is video card hardware scaling.

          And this is exactly the solution that the Xbox 360 uses. A lot of newer games are fill-rate limited: because of the complexity of the pixel shaders games use, the AMD Xenos integrated GPU in the Xbox 360 (similar to a Radeon X1800) can't maintain an acceptable frame rate at any resolution over 1024x576. So games use the Xenos's scaler to turn 1024x576 into 1280x720 or 1920x1080 pixels for the component or HDMI output.

    • Re:CRT's (Score:5, Insightful)

      by marcansoft ( 727665 ) <hector AT marcansoft DOT com> on Thursday October 25, 2012 @08:46PM (#41772753) Homepage

      This. I came here to say the same thing, but you already had. Every single modern graphics card is very efficient at scaling textures, and in fact, LCD scaling these days most often ends up happening on the GPU anyway. Don't touch my screen resolution. Ever. If the goal is to get better performance by rendering at a lower resolution, then render at a lower-resolution offscreen buffer and scale that up to the screen resolution.

      I wish Wine had a mode that did this for Windows games that expect to change the screen resolution and don't play well with Xinerama. These days I end up using Wine's "virtual desktop" mode with per-game settings and KDE's window-override support to put the game on the right display head and remove the borders, but it's a suboptimal, manual solution. The Linux game situation is slightly better (native games tend to be configurable to respect the current resolution and usually get the display head right), but they still don't have scaling support.

      Need inspiration? Do what video players (particularly mplayer) do. That is how fullscreen games should work.

      • by brion ( 1316 ) on Thursday October 25, 2012 @08:53PM (#41772809) Homepage
        This is exactly how some games work on Mac OS X, for instance Source-based games like Portal and Half-Life 2. They don't muck with the actual screen resolution; they just render into an offscreen buffer at whatever resolution and blit it, stretched, to the full screen. Switching from the game back to other apps doesn't disturb the desktop in any way. Would definitely love to see more Linux games using this technique.
      • by poet ( 8021 )

        I want to change the resolution, and I will tell you why: on a 32" monitor, I can't read the text unless it runs at 720p or even 800x600.

          I want to change the resolution, and I will tell you why: on a 32" monitor, I can't read the text unless it runs at 720p or even 800x600.

          Actually, unless you have become totally inured to blocky, pixelated displays, what you really want is for everything to be rendered larger.

          Fortunately, many operating systems support resolution independence [wikipedia.org], which would allow you to keep your display at its high, native resolution and still draw your widgets, text, etc at a large size. This is done by changing the DPI hint in the OS so that it knows to render things larger (or smaller).

          This approach would accomplish the overall effect you desire while avoi
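The DPI-based approach described above boils down to sizing everything in physical units and converting to pixels at render time. The conversion itself is one line (96 DPI is the traditional X default; the doubled value is just an example):

```python
def pt_to_px(points, dpi):
    """Convert a size in typographic points (1/72 inch) to pixels."""
    return round(points * dpi / 72)

# The same 12pt font stays physically readable as DPI scales up:
print(pt_to_px(12, 96))   # 16 px at the traditional 96 DPI
print(pt_to_px(12, 192))  # 32 px on a 2x high-density display
```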

    • Who is still running a CRT? Who wants any program to change the resolution of their screen?

      Gamers often do. An average application might run nicely at a high resolution, but for a smooth Skyrim experience, many people may find it necessary to allow it to run at a lower resolution.

    • Re:CRT's (Score:5, Interesting)

      by DRJlaw ( 946416 ) on Thursday October 25, 2012 @08:55PM (#41772835)

      Who is still running a CRT?

      This is not a CRT-only problem.

      Who wants any program to change the resolution of their screen?

      Gamers. [hardocp.com]

      This strikes me as the wrong solution to the problem:

      Not surprising, since you're ignoring the underlying problem. Your 2560x1600 desktop on that 30" LCD is going to kill the ability of your videocard to display a modern game at an acceptable frame rate. Many gamers will not accept windowed half-screen (or whatever fraction is required) gaming on their $1K LCD.

      A program should instead request the "current monitor's resolution" (because there can be more than one!) set its display area to that size, and then tell the window manager to "fullscreen" it by removing title bar and border decorations and moving it to (0,0) of that monitor. But NEVER EVER RESIZE MY MONITORS.

      No. Windows and OSX have figured this out. Linux window managers (at least one popular one) need to as well.

      The window manager should always be superior to the app, and one should always be able to manage the window (task switch, move to another desktop, etc) using the window manager, regardless of what the app thinks it is doing.

      Irrelevant to your desired scheme, where keyboard hotkeys would still be required. In Windows and OSX you can still task switch, move to another desktop, etc. using such hotkeys. Yet the game controls the resolution of the monitor in fullscreen mode.

      • Re:CRT's (Score:4, Insightful)

        by Carewolf ( 581105 ) on Thursday October 25, 2012 @09:24PM (#41773039) Homepage

        Not surprising, since you're ignoring the underlying problem. Your 2560x1600 desktop on that 30" LCD is going to kill the ability of your videocard to display a modern game at an acceptable frame rate. Many gamers will not accept windowed half-screen (or whatever fraction is required) gaming on their $1K LCD.

        No, you are missing his point. There is no reason the game could not run at a lower resolution and be scaled by the WM, instead of relying on the screen to do the rescaling. Only CRTs can rescale physically; LCDs end up doing it in their own hardware anyway, and usually in a crappier manner than the WM could manage.

        • by Chemisor ( 97276 )

          Unless your game uses OpenGL and you have a fully accelerated driver (read: the proprietary Catalyst or nVidia blob), it will not be able to scale fast enough. Most games use SDL and main-memory surfaces that are then blitted to the screen. Any scaling is done in software by the CPU and is dreadfully slow. My Core i7 can handle 1680x1050@60, but just barely, with one core pegged at 100%. The cheapest GPU, of course, can handle that easily, but you must run proprietary drivers and use OpenGL. If you don't, r

        • by adolf ( 21054 )

          But LCDs scale for free, while CPUs and GPUs do not.

          Why reinvent the wheel?

    • by antdude ( 79039 )

      Not me, but I would if new quality ones weren't impossible to find. Anyway, I still use low resolutions for old games, MAME, demos, etc. I also still use an old VGA KVM from Y2K, so keeping the native resolution and adding black bars doesn't work for me. :(

    • I'm still running a hugemongous CRT. It probably won't go bad for another four years.
    • Some of us have underpowered graphics cards or, god forbid, IGPs that can't play games at full rez, and need to downscale.

      Sorry, I'll go dig myself a hole and jump in it now; I'm obviously not worthy.

    • Re:CRT's (Score:4, Insightful)

      by UnknownSoldier ( 67820 ) on Thursday October 25, 2012 @10:47PM (#41773551)

      > This strikes me as the wrong solution to the problem: .. set its display area to that size
      *sigh*

      It is sad to see that you don't know what the hell you are talking about. Let me explain:

      There are these precious little things called RAM, Video Memory, Video Buffers, DMA speed, and Scaling.

      Games typically use _at least_ *four* buffers:

      * Frame Buffer Front (32-bit)
      * Frame Buffer Back (32-bit)
      * Depth Buffer (24-bit)
      * Stencil Buffer (8-bit)

      Why should the Window Manager force the app to *over* allocate memory say @ 1920x1080 when the user has selected 640x480??

      That is, why would you have the GPU waste a total of 24 MB (8 MB FB Front + 8 MB Back + 6 MB Depth + 2 MB Stencil) compared to ONLY needing 3.6 MB (1200K + 1200K + 900K + 300K) ??

      More memory allocated for the buffers means you have LESS resident textures on the GPU.

      Also, by using a native (lower) resolution you let the monitor do the up-scaling in *hardware*.

      > and then tell the window manager to "fullscreen"
      And this is done for "free" in your fantasy world??

      Why would you force the GPU to do the *extra* work of texture-copy up-scaling when it doesn't need to in the first place, if you are running at a 1:1 resolution at full-screen??

      > set its display area to that size, and then tell the window manager to "fullscreen" it by removing title bar and border decorations and moving it to (0,0) of that monitor.

      That is called "Windowed No Border Mode"

      i.e. Gamers want _3_ choices

      * Full-Screen (change resolution)
      * Windowed (don't change resolution)
      * Windowed No Border (don't change resolution)

      Lastly SOME games do NOT support arbitrary resolutions. I *want* them to fill my 22" monitor at whatever resolution they DO support. The *fonts* and the rest of the UI elements are BIGGER and easier to see when running in full-screen mode.

      Likewise, games that *only* run in full-screen mode are BADLY DESIGNED.

      The *proper* thing to do is to give the _user_ choice: Namely the 3 I listed above.

      Hope this helps explain some of the issues and WHY this solution is The Right Thing.
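The buffer arithmetic in the comment above checks out; a sketch of the bookkeeping (assuming tightly packed buffers with no driver padding, and ignoring that depth and stencil are often combined into a single D24S8 surface in practice):

```python
def buffer_bytes(w, h):
    """Total bytes for the four buffers listed above: 32-bit front and
    back color buffers, a 24-bit depth buffer, an 8-bit stencil buffer."""
    pixels = w * h
    return pixels * (4 + 4 + 3 + 1)

print(buffer_bytes(1920, 1080))  # 24883200 bytes, roughly the 24 MB cited
print(buffer_bytes(640, 480))    # 3686400 bytes, the 3.6 MB cited
```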

    • Re:CRT's (Score:5, Insightful)

      by obarthelemy ( 160321 ) on Thursday October 25, 2012 @10:52PM (#41773575)

      This is a very nice example of what is wrong with Linux. Not the actual problem. The attitude towards the problem and the users who experience it.

    • by slacka ( 713188 )

      Ever heard of 2880×1800 Retina displays? Like to play your games at 60 FPS? Well, as someone rocking one on a 15" monitor with a mid-to-low-end GPU, I frequently run into this issue under Linux. And let me tell you, the current system is terrible. Mac OS can do it, Windows can do it; if Linux wants to be competitive, it needs to fix this issue. Just a few days ago, I fired up Tux Racer for a friend to play. I think I had to reboot after that fiasco.

      I don't care how it's resolved, different TTY or _NET_WM

    • Because many games only come at a fixed resolution. Plus many screens have such an enormous resolution that games are impractical at that scale.

    • by Jonner ( 189691 )

      It's certainly a worthy goal to never need to change the monitor mode. However, I don't think we're quite there yet. Most games that rely on 3D acceleration cannot maintain the maximum frame rate at the maximum resolution supported by the monitor. Therefore, users need to be able to choose resolution to tune the game to their machine and preferences. Once frame rate is truly independent of mode, there should never be a need to reduce resolution.

  • by OhANameWhatName ( 2688401 ) on Thursday October 25, 2012 @08:41PM (#41772707)

    Martin Gräßlin of KDE was rather wary about the patch and said that games changing the resolution just tend to mess up the desktop

    KDE doesn't need the help.

  • Here's to hoping.

    Seems like a reasonable idea, given a bit of time to mature as a spec.

    So another ten years? Seriously, this is well past due. This is the second story in the last month or so about someone wanting to fix the desktop. Hopefully, if there are enough of them, one might actually gain traction. Here's hoping. The X system really is a heap. As much as the purists like to bitch about it, thank goodness for Nvidia when it comes to multiple-monitor support. Too bad it doesn't help the gaming, though

    • Re: (Score:3, Insightful)

      by ryanw ( 131814 )

      I'm pretty sure somebody did go in and fix the X11 desktop..... It was Apple w/ OSX.

      • I'm pretty sure somebody did go in and fix the X11 desktop..... It was Apple w/ OSX.

        Do you mean Display PostScript from NeXT? Apple never used X11 for its primary display (ignoring A/UX and mkLinux for the sake of convenience here), so they had nothing to fix.

  • Dump X (Score:5, Insightful)

    by Anonymous Coward on Thursday October 25, 2012 @08:45PM (#41772749)

    I still think X needs to go. For truly forward thinking, it needs to be replaced. Just look at Android. Android would not be useful if it were forced to use X.

    Frankly, X does too many things that too few people need. It was designed for a different era and scales to modern workflows very clumsily. Multi-monitor desktops and gaming on Windows are effortless. On X they're frankly a chore.

    Sorry, no, network transparency is not an important feature anymore. It's probably used by 0.001% of regular users. VNC/RDP-style remote access is the way it's done now. And no, nobody cares if it's technically inferior. It's hundreds of times easier to implement and use.

    Modern toolkits pretty much ignore 95% of X's built-in features and just pass bitmaps.

    Yeah, X has lots of cool things, but you have to realize most of them are impractical or unnecessary. Today we really have the memory, computational power, and bandwidth to bang whatever we want onto the screen without any trouble. The latency and overhead X presents are the enemies today.

    Now stop. Yes, you: stop. I know you're about to type up a 10-paragraph screed about how you ported X app to obscure platform Y, or remotely managed 10,000 servers with facial twitches and GTK. Just stop. Your use case does not represent the vast majority of computer users. It doesn't even represent a full fraction of a percent.

    Legacy baggage and clinging to old ideas spawned X.org. The same thing will spawn whatever replaces X.

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      This is a lot of FUD.

      Android? Look at the N9, with an award-winning UI. It uses X and is really cool (on outdated hardware).

      Network transparency is really useful. VNC/RDP sucks compared to X, and I don't see how it is easier to use than X. Maybe there are more GUIs for it that make things easier for beginners, but that is not for any technical reason.

      I don't see what overhead X causes. It worked fine decades ago. Latency is an issue over the network, but only because the toolkits never cared about that. It's not a p

    • by deek ( 22697 )

      I agree that we need to come up with a brand new system to handle today's graphics hardware. That's what Wayland is for, and why it's such an interesting project. It is not legacy baggage, but a system designed from the ground up. You have heard of it, haven't you? Seems like every Linux user and their dog knows about it these days.

      Also, I'm very glad that Wayland is implementing an X compatibility layer. I'm one of that fraction of a percent who use and enjoy network transparency. It would annoy the hell out

      • Re: (Score:2, Informative)

        by agrif ( 960591 )

        Wayland has an X compatibility layer, sure, but you may also be pleased to know that there are efforts underway to get native Wayland network transparency.

        See, the cool thing about Wayland is that the protocol is written to be flexible enough to let some other program handle the network-transparency part, seamlessly. It's not part of the core design of Wayland simply because it doesn't have to be in the core at all.

        An added bonus of this flexibility is the ability to do network-transparency things that even X c

        • by deek ( 22697 )

          That ... is awesome news! Thanks for that information. I hadn't heard anything about it.

          The X compatibility layer will still be useful regardless, but I'm very glad to hear that Wayland will likely have network transparency as well. The more I hear about this system, the more I'm liking it.

  • It is a bit unusual to craft a news entry with deeply technical stuff taken from project mailing lists. What is _NET_WM_STATE_FULLSCREEN_EXCLUSIVE? A flag in some protocol?
    • The issue is that some programs change the screen resolution, and other programs take notice and rearrange their windows and icons when a screen-resolution change notification arrives.

      The problem is that there are no semantics in X that allow a program to change the screen resolution while NOT causing those other programs to do stuff.

      This new flag is to signal these semantics. "Hey, we are changing the resolution, but we have this new idea called Exclusive Control over the display, so nobody nee
      • And is the flag passed to an API (libX11 level? higher?), or does it live within the X11 protocol? Or both? I understand the background; I was just saying it was weird to use a flag name as it's #define'd in source code, without the context required for it to make any sense.
      • by jedidiah ( 1196 )

        I dunno. If a game is running amok because gamers and game programmers suffer from an 80s mentality that a computer is a game console, then perhaps you don't want the rest of the GUI acknowledging this foolishness.

        The fact that games on Linux don't scramble my desktop like they do under Windows IS ACTUALLY A GOOD THING.

        Even with the status quo, cleaning up after a game run amok is less bothersome under Linux.

  • And then make sure that different versions of it can't coexist on the same system and can't run each other's code. Perhaps change all the method calls every build.
  • When a game starts, it wants the entire desktop; it doesn't want the other desktop elements at all: no dock, no icons, no interaction.

    Why isn't there a function to create a new virtual desktop at any resolution you want, leaving the original desktop untouched? Make the resolution a property of the desktop, so that when you switch between desktops the system knows what mode to switch to.

    Seems like an easy fix.

  • by mattdm ( 1931 ) on Thursday October 25, 2012 @09:11PM (#41772969) Homepage

    I don't know if kids today remember, but Loki Games was one of the first commercial plays for big-name games on Linux. It ended in tragic business troubles and financial doom.

    It warms my heart to see that Sam Lantinga is still working on SDL.

    That is all.

  • by smugfunt ( 8972 ) on Thursday October 25, 2012 @09:53PM (#41773243)

    Not sure what 'mess' is referred to in the title, but I sidestepped the issues I met with Baldur's Gate by running it in its own X server on a separate VT.
    As I recall, it just took a simple script to start the server and the game, and one extra command to make the mouse cursor less fugly. My main desktop remained completely undisturbed and just an Alt-F7 away. With a little polish, this approach could be a good general solution, no?

  • by uvajed_ekil ( 914487 ) on Thursday October 25, 2012 @10:54PM (#41773581)
    Why hasn't Linux taken off yet on the mainstream desktop (er, laptop)? Why don't average folks want to run Linux yet? Isn't it ready for prime time?
  • ...that make me realize I may not really belong here.

"Here's something to think about: How come you never see a headline like `Psychic Wins Lottery.'" -- Comedian Jay Leno

Working...