Do Games Know The Secret Of UI?

A reader writes "There is a nice interview at the BBC talking about how computer games are the ones pushing the envelope. Particularly interesting is that it doesn't just deal with the tech aspects, but goes into the user interface aspect as well." Having conversed with her on a number of occasions, I can attest to JC being smart. Good interview.
  • Your BBC links (Score:2, Insightful)

    by praedor ( 218403 ) on Friday August 31, 2001 @02:22PM (#2240090) Homepage

    You know, just about every damn time I try to connect to the BBC site via Slashdot (including with this story) it doesn't work. There appears to be something REALLY dicked about a lot of DNS servers. I suggest that from now on, instead of linking to the BBC URL, you guys use the IP address, which always works.

    MOST of the time the BBC URL is broken and gives an IMMEDIATE "unknown host" message. Type in the IP and voilà! Instant connection.

  • From Experience... (Score:5, Insightful)

    by keesh ( 202812 ) on Friday August 31, 2001 @02:23PM (#2240093) Homepage
    Gamers want fancy interfaces. I know someone who's a huge fan of Civ, Alpha Centauri et al., but when I introduced him to FreeCiv his first comment was "the interface sucks". This isn't someone who's computer illiterate, either.

    It seems that people want something different when playing a game. They don't want just their standard operating system look, they want fullscreen fancy eyecandy, even when that's not the nicest option.

    You can even see this in game editors -- AFAIK, WorldCraft is the only editor even close to the standard OS style...

    Whether it's because the whole screen should look SciFi / Fantasy / Whatever, or simply because users want something different, game interfaces have to be different from usual programs.
  • by garcia ( 6573 ) on Friday August 31, 2001 @02:23PM (#2240094)
    remember there are other applications (other than just the military and games, as she mentioned) that use most of the CPU (RC5, Netscape ;))

    this really has little to do with UI. It has to do with what she feels is important in the industry at this time (cell phones that are connected).

    It's true that games love faster CPUs, but it's also true that it's probably possible to make much faster/better games within the constraints we already have; people just don't care to do that anymore (remember 64k games that looked cool as hell, or even 4 MB games?)

    Sending your picture in front of the Eiffel tower to your kids on your cell phone is less important than decreasing the bloat!
  • by sheetsda ( 230887 ) on Friday August 31, 2001 @02:30PM (#2240130)
    The only thing that will push a computer to its limits is a game. No one admits it but no one needs a new computer to do a spreadsheet programme or Word document.

    *sigh* This is what I tried to tell my uncle last weekend when he shelled out way too much money for a 1.4 GHz P4 with a GeForce2 and 128 megs of RAM to run Microsoft Windows/Office. He believes buying a top-of-the-line system now will save him from having to buy another one in a couple of years. Ha! Good luck. Lusers just won't listen.

  • Whatever (Score:2, Insightful)

    by swagr ( 244747 ) on Friday August 31, 2001 @02:33PM (#2240145) Homepage
    What Hertz SHOULD have said is that games are the only commercial applications used by the masses that maximize CPU usage ...

    Yes, I'm sure no one has ever maxed a CPU for hours or days on end modelling fluid dynamics, or physical optics, or encoding mpegs, or ...

  • by tim_maroney ( 239442 ) on Friday August 31, 2001 @02:36PM (#2240166) Homepage
    Incremental disclosure with sticky adaptation, the single UI principle discussed in the interview, has been well known in the design community since the 1980s.

    Just because Microsoft doesn't make good use of the principle doesn't mean that it's a gift from gaming to the rest of the world.

    In most other ways, games are UI nightmares. They're difficult by design. Applying their principles to other domains would be a giant step backwards. Non-entertainment systems should be easy by design, rather than conjuring obstacles for the thrill of overcoming them.

    Fans of UNIX will, of course, disagree. The popularity of archaic command-line interfaces in the UNIX subculture could perhaps be understood as a consequence of gamer-like behavior among hobbyists and tinkerers.

  • by Midnight Thunder ( 17205 ) on Friday August 31, 2001 @02:37PM (#2240174) Homepage Journal
    Another point that should be mentioned is that with faster processors, new types of applications become accessible to consumers. Imagine trying to edit your home video, or do other creative work, on a computer four years ago; it would not have been possible.

    The way I see it, is that while games push the envelope, faster processors make new kinds of applications available and the interest in those applications also help people want faster computers.

    We all use word processors and spreadsheets, but there are also a lot of people who want to be creative with their computers.
  • Wiggle Room (Score:2, Insightful)

    by Foggy Tristan ( 220356 ) on Friday August 31, 2001 @02:42PM (#2240201)
    I'd disagree that games are necessarily better for UI development; it's just that games have a lot more wiggle room in terms of bad user interface. A game like Leisure Suit Larry can get away with not having standard-looking buttons, and a game like Myst III: Exile can get away with not having standard-looking icons.

    That doesn't mean, however, that games can get away with bad UIs. The eGames sample I stupidly picked up has one of the worst interfaces possible, and most of the games are individually difficult to manage.

    And finally, it's worth pointing out there's no standard UI for a laser blaster. ("The cross-sight must be in red, with a slightly thicker line near the center...")
  • by mughi ( 32874 ) on Friday August 31, 2001 @02:48PM (#2240237)

    Good post, but...

    When I'm playing a game, I want to be immersed in a virtual world. When I'm writing, or designing graphics for a Web site, or pounding out code, or looking for information on some obscure subject, I want a clean, simple interface that makes it as easy as possible for me to get, create, or manipulate my data. And that's it.

    I'd have to disagree and say that the basic principle is the same. When I'm playing a game (Myth II for example), I want to focus on paying attention to the health of my units, where I want to get them to go, and not have to worry about the mechanics of actually achieving it. When I'm writing a document, I want to focus on my train of thought, what I'm trying to say, etc. and not have to worry about the mechanics of using the word processor. Different paradigm, but same UI goal.

    I would say that many games I've played seem to have gotten this down well. Perhaps it's because of the focus where they know that no player is going to actually bother reading the manual, and the developers need to keep in mind the needs of a novice user just sitting down at the program for the first time.

    The game 'tutorial' intro level and the wavy green lines in Word: both good steps along this path.

  • by KFury ( 19522 ) on Friday August 31, 2001 @02:51PM (#2240251) Homepage
    There's a lot to be said for consistency in UI. While games introduce some daring new metaphors and interaction models, it doesn't do a whole lot of good when each iteration forces you to relearn several of the skills you already learned. (This, by the way, is also my beef with Mac OS X. People learn how to use a Finder and you make them use a totally new one!)

    On the simplest level it's things like the 'inverted mouse' problem in FPS games, but whenever a hot game developer figures out a cool way to convey manipulation of another custom game feature, it detracts from the learning curve.

    It's a shame that 'pushing the envelope' and 'consistency of design' are orthogonal terms. It would be great if the game designers got together and admitted that while they're each trying to make the better game, establishing consistent design patterns for interactivity can increase the playability of all games, and let the struggle be with the puzzles, not the interface.
  • by Brownstar ( 139242 ) on Friday August 31, 2001 @02:54PM (#2240275)
    You don't play RPGs very often, do you? I can think of plenty of times when, at a particular point in a game, I needed to do something (but my character didn't have enough experience to do it) to continue with the game.
  • by WinterSolstice ( 223271 ) on Friday August 31, 2001 @02:58PM (#2240293)
    Hey, what about those of us who would like to have lighting and fog effects when writing letters? I think it would be seriously cool if the next "Cease and Desist" letter I got had really cool real-time smoke.

    I agree that an interface should be straightforward and simple. However, users LOVE eye candy. We actually have users at my company who run PowerPoint on their desktops. They like having desktop wallpaper, and our policies prohibit it. They are willing to take the performance hit just for that useless bit of color.

    As for me, I'll just sit back and enjoy my heavily tuned Enlightenment desktop that uses more RAM and CPU than my first 6 computers had, combined.


  • by Brownstar ( 139242 ) on Friday August 31, 2001 @03:01PM (#2240309)
    I think you're confused as to what the user interface in a game is.

    The special effects like fog and realistic lighting are part of what is being presented; you don't ever actually use them. The user interface is the menus, the hand icon, etc...

    One of the reasons why you may have mistaken that is because UIs in good games have gotten so seamless with the game that it's hard to tell the UI from the actual game (take Black & White, for example).
  • by Junks Jerzey ( 54586 ) on Friday August 31, 2001 @03:01PM (#2240310)
    Most game UIs are written with custom code, not huge object-oriented libraries. And they tend to be very usable and snappy on what amounts to low-end hardware (thinking of game consoles here). Compare this to any method of creating a UI for your favorite OS, whatever it may be. It is an order of magnitude easier to write a game-like UI from scratch than it is to learn to use any of the various UI toolkits, even if you already know those toolkits.

    Along those lines, I am continually amazed when Windows XP (or even a new KDE or whatever) requires significantly more CPU power than the previous version. Does handling clicks on widgets _really_ take that much processing power? We just blindly assume "oh yeah, context-sensitive help, that's _gotta_ be expensive." But c'mon, these things could have been lightning fast on a Commodore 64.
  • by mblase ( 200735 ) on Friday August 31, 2001 @03:02PM (#2240318)
    ...since every iteration of the Microsoft or Apple OS requires more RAM, a faster processor, and more colors on the monitor, I think it's more accurate to say that no one needs a new computer to do a spreadsheet program or Word document, provided they don't want to use the latest version.

    And besides, there's more to a computer than just the processor and graphics card. I've got a three-year-old PowerMac clone sitting at home, and I can hardly use it for anything new. It does its job fine, but all its hardware is legacy -- DIMMs, SCSI, and serial ports, while everything else is moving to SDRAM, FireWire, and USB. This phenomenon exists in the PC world as well, just to a lesser degree. Ironically, upgrading my machine will cost me more money than buying a brand-new one with USB and SDRAM on the motherboard.

    In other words: if I want to make my machine compatible with a Palm handheld, a digital camera, a joystick, or a new printer, I need to spend the money to upgrade it first. If I want to do anything like digital video, I have to upgrade it a lot. Even downloaded Flash multimedia ran slow until I upgraded the processor, and I sure can't add an MP3 jukebox without a substantial hard drive upgrade (2 gigs doesn't go as far as it used to).

    Games push the envelope harder than anything else in the consumer industry, true. But it's hardly the only thing. There's more to consumer PCs these days than video games and word processing, and it's all more demanding than it used to be.
  • by Daniel Dvorkin ( 106857 ) on Friday August 31, 2001 @03:02PM (#2240321) Homepage Journal
    Sure, but in that case the fog is an abstract representation of something else happening -- whereas in a game, the purpose of fog effects is _actually to look like fog_. I think that may be _the_ fundamental difference between games and general-purpose apps, in fact. Games try to create a virtual world and put you into it; most other kinds of apps try to provide you with a useful metaphor through which you can manipulate the real world.
  • Perfect UI (Score:2, Insightful)

    by mcelli ( 518034 ) on Friday August 31, 2001 @03:04PM (#2240333)
    In response to the title of this post: games have the secret to the UI because they are single-task programs. Saying a game has the perfect UI is like saying a toaster has the perfect UI. I think that the number one rule of UI is: the less you can do, the easier it is to do it.

    The article brings up some good points about making things more real, but personally, it's no more real to me now than it was in the days of ColecoVision. Final Fantasy X doesn't make me feel any more like I'm "in the game" than Final Fantasy I did. Graphics and presentation have obviously gotten better, but that's only made games nicer to look at; it hasn't made them any more real for me.

    I'd like to hear people's comments on whether or not these graphics bring a sense of realism. I equate it to the change from, say, twm to GNOME/KDE: it's prettier, but it's not any more "real".

  • by mattecc ( 109051 ) on Friday August 31, 2001 @03:28PM (#2240464)
    I most certainly agree that games are driving a lot of innovation in all parts of software.

    I think the reason is simple though. Since games have such a short lifetime, the designers are always free to try radically new ideas. If it works out, great. If not, oh well, they can try something better the next time.

    They also have users who don't mind, and actually expect, to start from square one, so games don't have to treat minimal disruption of the user's existing habits as a design goal.

  • by cpt kangarooski ( 3773 ) on Friday August 31, 2001 @03:31PM (#2240489) Homepage
    Which leads to something I've been saying for a while: GUIs and CLIs should be extremely tightly integrated. That isn't to say that a user would ever, ever be required to use an interface he was uncomfortable with. The two methods would be alternatives to each other, but more than the sum of their parts when used in tandem.

    Having three or four terminals open in XWindows is _not_ an example of this, by any means.

    For example, imagine that you want to move all your object files, plus a few others that don't have anything in common (save their relevance to you -- i.e. not the same name, or file type, etc.).

    You could quickly navigate to the appropriate directory in the GUI -- it's faster unless you remember the precise (short) path. Type a command along the lines of "select *.o" into the CLI parser of that _very_ GUI directory window, and the appropriate _icons_ highlight and are selected. Quickly mouse around to the other couple of icons you want, and shift-click to add them to the selection.

    Then drag the icons from the window into another folder visible onscreen (which may be easier than having to remember and type in another pathname), change over to that window and enter a command like "rename * *.backup" to rename all of the moved files.

    (N.B. command names would likely exist in several forms, with the full name of the command being the easiest to understand -- for consistency's sake, it would be precisely the same name as used in the GUI.)

    Both pointing and grunting at things and talking about them are good ways to control a computer. In the real world, we recognize the usefulness of using them in conjunction rather than either exclusively. There's a place for that here too.
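    The hybrid selection idea above can be sketched in a few lines. The window class and its methods here are hypothetical, invented purely to illustrate a folder window whose selection can be driven either by typed glob commands or by mouse clicks:

```python
import fnmatch

class HybridFolderWindow:
    """Hypothetical GUI folder window whose selection can be driven
    by typed glob commands and by mouse clicks interchangeably."""

    def __init__(self, filenames):
        self.filenames = list(filenames)
        self.selected = set()

    def select(self, pattern):
        # CLI half: add every icon matching the glob to the selection.
        self.selected |= {f for f in self.filenames
                          if fnmatch.fnmatch(f, pattern)}

    def shift_click(self, filename):
        # GUI half: toggle a single icon in or out of the selection.
        self.selected ^= {filename}

win = HybridFolderWindow(["main.o", "util.o", "main.c", "notes.txt"])
win.select("*.o")             # typed command grabs main.o and util.o
win.shift_click("notes.txt")  # shift-click adds one unrelated file
print(sorted(win.selected))   # -> ['main.o', 'notes.txt', 'util.o']
```

    Either input method mutates the same selection state, which is the point of the comment: neither interface is a second-class citizen.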
  • by xxxtac2 ( 248028 ) on Friday August 31, 2001 @03:42PM (#2240547) Homepage
    I personally think that games really do push the envelope on UI design. Take games like Black & White that use gesture-based control. This would be a great ability in many pieces of real software. Imagine being able to trigger filters or switch drawing tools in Photoshop simply by making quick gestures. The learning curve would be a drawback, but it would be the same as with hotkeys and key combinations: new users wouldn't be affected, but power users would learn to use them and they'd become a natural efficiency booster.
    This is just one good example of a UI feature used in a game that would be very useful in real software applications. Sure, many games have stupid and unnatural interfaces, but many also have strong elements that could prove immensely useful in the future.
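    A minimal sketch of the gesture idea: reduce a mouse trail to a coarse direction string and look it up in a binding table, exactly as a hotkey map would. The gesture bindings and action names are made up for illustration:

```python
def strokes_to_directions(points):
    """Reduce a mouse trail (list of (x, y) points) to a string of
    coarse directions: U/D/L/R, with consecutive repeats collapsed."""
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) >= abs(dy):
            d = "R" if dx > 0 else "L"
        else:
            d = "D" if dy > 0 else "U"   # screen coords: y grows downward
        if not dirs or dirs[-1] != d:
            dirs.append(d)
    return "".join(dirs)

# Hypothetical bindings, directly analogous to a hotkey table.
GESTURES = {"RD": "next_tool", "LU": "undo"}

trail = [(0, 0), (5, 0), (10, 1), (10, 6), (11, 12)]  # right, then down
print(GESTURES.get(strokes_to_directions(trail)))     # -> next_tool
```

    New users simply never draw gestures and lose nothing; power users pick them up over time, which matches the hotkey comparison in the comment.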
  • Not quite right (Score:3, Insightful)

    by alexjohns ( 53323 ) on Friday August 31, 2001 @04:41PM (#2240864) Journal
    Games only push the speed of the processor and the video card. That's it. Most games play off the CD, so they don't push the size of the hard drive. They couldn't care less about your printer, scanner, or anything else like that. Most big software packages require more RAM than any game. I have 512 MB at work, and not because I run games.

    So Intel and AMD love games. I imagine RAM manufacturers like bloated office-app developers and bloated OS developers -- MS springs to mind. CD player/recorder makers like musicians. Printer makers like businesses and old people who want a hard copy of everything. Scanner makers love the Internet for making everyone want to share their pictures.

    So companies like HP could conceivably help their bottom line by supporting musicians, longevity drugs, and getting more people on the internet. How about that. Someone should tell Bruce Perens.
  • by prnz ( 33032 ) on Friday August 31, 2001 @06:18PM (#2241263)
    Well, there are both good and bad ways to use adaptive UI, just as with anything else in UI design. "Personalized menus" happens to be an example of one that's particularly annoying.

    In addition to organizing commands into categories, menus already hide them away until they're needed; that's the whole point. Selective-display menus hide the commands even more and just add extra steps to get to them when they're needed. What's worse, the user has been learning to use the interface, and using a hidden command moves it to the "recently used" list. This adds it back to the truncated menu, often rearranging it so any benefit from learning the shortened version is now lost, and the user has to relearn or retrain muscle memory. I hate it too, and I always turn it off ("adapting" the UI in a way that works).

    A better use of adaptive UI is not to change the layout or components out from under the user, but to look for patterns of usage and facilitate those. For example, if a user consistently follows action 'a' with actions 'g' and 'i', the adaptive UI could recognize this and ask politely if these should be combined into a single action, perhaps putting a button on the toolbar if a-g-i is frequently used. That's a pretty simple example, but it shows that adaptation isn't necessarily all bad.
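    The a-g-i pattern detection described above can be sketched as a simple sequence counter. The class, the window size of three, and the suggestion threshold are all arbitrary choices for the sketch, not any real toolkit's API:

```python
from collections import Counter

class AdaptiveWatcher:
    """Counts how often each 3-action sequence occurs and, once a
    sequence crosses a threshold, suggests promoting it to a single
    toolbar button -- asking the user instead of rearranging the UI."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.recent = []           # full action history (sketch only)
        self.counts = Counter()    # frequency of each 3-action window

    def record(self, action):
        self.recent.append(action)
        if len(self.recent) >= 3:
            seq = tuple(self.recent[-3:])
            self.counts[seq] += 1
            if self.counts[seq] == self.threshold:
                return f"Combine {'-'.join(seq)} into one button?"
        return None

watcher = AdaptiveWatcher()
prompt = None
for tail in ("x", "y", "z"):           # a-g-i recurs; the tail varies
    for action in ("a", "g", "i", tail):
        prompt = watcher.record(action) or prompt
print(prompt)  # -> Combine a-g-i into one button?
```

    Crucially, the adaptation is additive and opt-in: the existing layout never changes out from under the user, which is the distinction the comment draws against personalized menus.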

    Adaptive UI will probably develop like graphical UIs have in the past: by trial and error, to see what works and what doesn't when you put it in front of the user. Most of it probably won't work, because UIs are designed either by programmers, who often have a hard time separating the internals of the program from the way it's used, or by marketing folks, who think that more gimmicks, flash, and colors equals better.

    It's my feeling that adaptive UIs that complement rather than hinder both learning and experienced users are still off in the future (but maybe that's because I'm on my second read through The Diamond Age and thinking of the Primer).

    Just my 2 1909-S VDBs,
  • by Anonymous Coward on Saturday September 01, 2001 @12:26AM (#2242172)

    Imagine trying to edit your home video on your computer, or trying to do other creative work, on a computer four years ago and it would not have been possible.

    Heh, another guy who never heard of the Amiga.

    You're right, of course. 4 years ago, you couldn't buy a home computer to do that. But 8 years ago you could! :)
