Do Games Know The Secret Of UI?
A reader writes "There is a nice interview at the BBC talking about how computer games are the ones pushing the envelope. Particularly interesting is that it doesn't just deal with the tech aspects, but goes into the user interface aspect as well." Having conversed with her on a number of occasions, I can attest to JC being smart. Good interview.
Your BBC links (Score:2, Insightful)
You know, just about every damn time I try to connect to the BBC site via slashdot (including with this story) it doesn't work. There appears to be something REALLY dicked about a lot of DNS servers. I suggest that from now on, instead of linking to the bbc URL you guys use the IP address, which always works.
MOST of the time the BBC URL is broken and gives an IMMEDIATE "unknown host" message. Type in the IP and voilà! Instant connection.
From Experience... (Score:5, Insightful)
It seems that people want something different when playing a game. They don't want just their standard operating system look; they want fancy fullscreen eye candy, even when that's not the nicest option.
You can even see this in game editors -- AFAIK, WorldCraft is the only editor even close to the standard OS style...
Whether it's because the whole screen should look SciFi / Fantasy / Whatever, or simply because users want something different, game interfaces have to be different from usual programs.
games aren't the only thing that uses 100% CPU. (Score:3, Insightful)
this really has little to do w/UI. It has to do w/what she feels is important in the industry at this time (cell phones that are connected).
It's true that games love faster CPUs, but it's also probably possible to make much faster/better games within the constraints we already have; people just don't care to do that anymore (remember 64k demos that looked cool as hell, or even 4 MB games?).
Sending your picture in front of the Eiffel tower to your kids on your cell phone is less important than decreasing the bloat!
Nothing pushes a computer like games. (Score:3, Insightful)
*sigh* This is what I tried to tell my uncle last weekend when he shelled out way too much money for a 1.4 GHz P4 with a Geforce2 and 128 megs of RAM to run Microsoft Windows/Office. He believes buying a top of the line system now will save him from having to buy another one in a couple years. Ha! Good luck. Lusers just won't listen.
Whatever (Score:2, Insightful)
Yes, I'm sure no one has ever maxed a CPU for hours or days on end modelling fluid dynamics, or physical optics, or encoding mpegs, or
incremental disclosure and game UI (Score:5, Insightful)
Just because Microsoft doesn't make good use of the principle doesn't mean that it's a gift from gaming to the rest of the world.
In most other ways, games are UI nightmares. They're difficult by design. Applying their principles to other domains would be a giant step backwards. Non-entertainment systems should be easy by design, rather than conjuring obstacles for the thrill of overcoming them.
Fans of UNIX will, of course, disagree. The popularity of archaic command-line interfaces in the UNIX subculture could perhaps be understood as a consequence of gamer-like behavior among hobbyists and tinkerers.
Tim
Re:games aren't the only thing that uses 100% CPU. (Score:4, Insightful)
The way I see it, while games push the envelope, faster processors make new kinds of applications available, and interest in those applications also helps make people want faster computers.
We all use word processors and spreadsheets, but there are also a lot of people who want to be creative with their computers.
Wiggle Room (Score:2, Insightful)
That doesn't mean, however, that games can't have bad UIs. The eGames sampler I stupidly picked up has one of the worst interfaces possible, and most of the games are individually difficult to manage.
And finally, it's worth pointing out there's no standard UI for a laser blaster. ("The crosshair must be in red, with a slightly thicker line near the center...")
Re:Games pushing hardware is great ... (Score:2, Insightful)
Good post, but...
I'd have to disagree and say that the basic principle is the same. When I'm playing a game (Myth II for example), I want to focus on paying attention to the health of my units, where I want to get them to go, and not have to worry about the mechanics of actually achieving it. When I'm writing a document, I want to focus on my train of thought, what I'm trying to say, etc. and not have to worry about the mechanics of using the word processor. Different paradigm, but same UI goal.
I would say that many games I've played seem to have gotten this down well. Perhaps it's because the developers know that no player is actually going to bother reading the manual, so they have to keep in mind the needs of a novice user sitting down at the program for the first time.
The game 'tutorial' intro level and the wavy green lines in Word: both good steps along this path.
Pushing the oxymoronic UI envelope (Score:5, Insightful)
On the simplest level it's things like the 'inverted mouse' problem in FPS games, but whenever a hot game developer figures out a cool new way to convey manipulation of a custom game feature, it adds to the learning curve.
It's a shame that 'pushing the envelope' and 'consistency of design' are orthogonal goals. It would be great if the game designers got together and admitted that, while they're each trying to make the better game, establishing consistent design patterns for interactivity can increase the playability of all games, and let the struggle be with the puzzles, not the interface.
Re:Games pushing hardware is great ... (Score:2, Insightful)
I agree that an interface should be straight-forward, and simple. However, users LOVE eye candy. Just go look at Themes.org. We actually have users at my company who run PowerPoint on their desktops. They like having desktop wallpaper, and our policies prohibit it. They are willing to take the performance hit just for that useless bit of color.
As for me, I'll just sit back and enjoy my heavily tuned Enlightenment desktop that uses more RAM and CPU than my first 6 computers had, combined.
-WS
Re:Games pushing hardware is great ... (Score:3, Insightful)
The special effects like fog and realistic lighting are part of what is being presented, you don't ever actually use it. The user interface is the menus, hand icon, etc...
One of the reasons you may have mistaken that is that UIs in good games have gotten so seamless with the game, it's hard to tell the UI from the actual game (take Black & White, for example).
If nothing else, game UI's are focused (Score:4, Insightful)
Along those lines, I am continually amazed when Windows XP (or even a new KDE or whatever) requires significantly more CPU power than the previous version. Does handling clicks on widgets _really_ take that much processing power? We just blindly assume "oh yeah, context sensitive help, that's _gotta_ be expensive." But c'mon, these things could have been lightning fast on the Commodore 64.
This isn't technically true... (Score:5, Insightful)
And besides, there's more to a computer than just the processor and graphics card. I've got a three-year-old PowerMac clone sitting at home, and I can hardly use it for anything new. It does its job fine, but all its hardware is legacy -- DIMMs, SCSI, and serial ports while everything else is moving to SDRAM, FireWire, and USB. This phenomenon exists in the PC world as well, just to a lesser degree. If I want to upgrade my machine, it will ironically cost me more money than buying a brand-new one with USB and SDRAM on the motherboard.
In other words: if I want to make my machine compatible with a Palm handheld, a digital camera, a joystick, or a new printer, I need to spend the money to upgrade it first. If I want to do anything like digital video, I have to upgrade it a lot. Even downloaded Flash multimedia ran slow until I upgraded the processor, and I sure can't add an MP3 jukebox without a substantial hard drive upgrade (2 gigs doesn't go as far as it used to).
Games push the envelope harder than anything else in the consumer industry, true. But it's hardly the only thing. There's more to consumer PCs these days than video games and word processing, and it's all more demanding than it used to be.
Perfect UI (Score:2, Insightful)
The article brings up some good points about making things more real, but personally, it's no more real to me now than it was in the days of ColecoVision. Final Fantasy X doesn't make me feel any more like I'm "in the game" than Final Fantasy I did. Graphics and presentation have obviously gotten better, but that's only made games nicer to look at; it hasn't made them any more real for me.
I'd like to hear people's comments on whether or not these graphics bring a sense of realism. I equate it to the change from, say, twm to GNOME/KDE: it's prettier, but it's not any more "real".
Disposable software makes it possible (Score:2, Insightful)
I think the reason is simple though. Since games have such a short lifetime, the designers are always free to try radically new ideas. If it works out, great. If not, oh well, they can try something better the next time.
They also have users who don't mind, and actually expect, to start from square one, so games don't have "be minimally invasive to the user's existing instincts" as a design goal.
Re:incremental disclosure and game UI (Score:5, Insightful)
Having three or four terminals open in XWindows is _not_ an example of this, by any means.
For example, imagine that you want to move all your object files, plus a few others that don't have anything in common (except to you - i.e. they don't share a name, file type, etc.).
You could quickly navigate to the appropriate directory in the GUI (which is faster unless you remember the precise, short path), type a command along the lines of "select *.o" into the CLI parser of that _very_ GUI directory window, and the appropriate _icons_ highlight and are selected. Quickly mouse around to the other couple of icons you want, and shift-click to add them to the selection.
Then drag the icons from the window into another folder visible onscreen (which may be easier than remembering and typing in another pathname), change over to that window, and enter a command like "rename * *.backup" to rename all of the moved files.
(n.b. command names would likely exist in several forms, with the full name of the command being the easiest to understand - for consistency's sake, it would be precisely the same name as used in the GUI.)
Both pointing & grunting at things, as well as talking about them are good ways to control a computer. In the real world, we recognize the usefulness of using them in conjunction, rather than either exclusively. There's a place for that here too.
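The hybrid workflow above (glob-select in a folder window, shift-click to add stragglers, batch-rename the selection) can be sketched in a few lines of Python. Everything here is hypothetical - the class and method names just model what such a GUI window might expose:

```python
import fnmatch

# Toy model of a GUI folder window with an embedded CLI:
# typing "select *.o" highlights matching icons, shift-click adds more,
# and a rename command acts on whatever is currently selected.
class FolderWindow:
    def __init__(self, files):
        self.files = set(files)
        self.selection = set()

    def select(self, pattern):
        """CLI command: select every icon whose name matches a glob pattern."""
        self.selection = set(fnmatch.filter(self.files, pattern))

    def shift_click(self, name):
        """Mouse action: add one more icon to the current selection."""
        if name in self.files:
            self.selection.add(name)

    def rename_selection(self, suffix):
        """CLI command: append a suffix to every selected file."""
        for name in sorted(self.selection):
            self.files.remove(name)
            self.files.add(name + suffix)
        self.selection = {name + suffix for name in self.selection}

win = FolderWindow(["a.o", "b.o", "notes.txt", "Makefile"])
win.select("*.o")            # CLI: grabs a.o and b.o
win.shift_click("Makefile")  # mouse: add one unrelated file
win.rename_selection(".backup")
print(sorted(win.files))     # the three renamed files plus untouched notes.txt
```

The point isn't the implementation; it's that the same `select` verb could drive both the visible highlighting and the scriptable behavior, so the mouse and the keyboard operate on one shared selection.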
Good examples of innovation in games (Score:2, Insightful)
This is just one good example of a UI feature used in a game that would be very useful in real software applications. Sure, many games have stupid and unnatural interfaces, but many also have strong elements that could prove immensely useful in the future.
Not quite right (Score:3, Insightful)
So Intel and AMD love games. I imagine RAM manufacturers like bloated office app developers, and bloated OS developers - MS springs to mind. CD player/recorder makers like musicians. Printer makers like business and old people who want a hard copy of everything. Scanner makers love the internet for wanting everyone to share their pictures.
So companies like HP could conceivably help their bottom line by supporting musicians, longevity drugs, and getting more people on the internet. How about that. Someone should tell Bruce Perens.
Re:Adaptive UI Question - reply with your answer (Score:2, Insightful)
In addition to organizing commands into categories, menus already hide them away until they're needed; that's the whole point. Selective-display menus hide the commands even more and just add extra steps to get to them when they're needed. What's worse, the user has been learning to use the interface, and using a hidden command moves it to the "recently used" list. This adds it back to the truncated menu, often rearranging it so any benefit from learning the shortened version is now lost, and the user has to relearn or retrain muscle memory. I hate it too, and I always turn it off ("adapting" the UI in a way that works).
A better use of adaptive UI is not to change the layout or components out from under the user, but to look for patterns of usage and facilitate those. For example, if a user consistently follows action 'a' with actions 'g' and 'i', the adaptive UI could recognize this and ask politely if these should be combined into a single action, perhaps putting a button on the toolbar if a-g-i is frequently used. That's a pretty simple example, but it shows that adaptation isn't necessarily all bad.
Adaptive UI will probably develop like graphical UIs have in the past: by trial and error, seeing what works and what doesn't when you put it in front of the user. Most of it probably won't work, because UIs are designed either by programmers, who often have a hard time separating the internals of the program from the way it's used, or by marketing folks, who think that more gimmicks, flash, and colors equals better.
It's my feeling that adaptive UIs that complement rather than hinder both learning and experienced users are still off in the future (but maybe that's because I'm on my second read through The Diamond Age and thinking of the Primer).
Just my 2 1909-S VDBs,
Paul
Re:games aren't the only thing that uses 100% CPU. (Score:1, Insightful)
Imagine trying to edit your home video on your computer, or trying to do other creative work, on a computer four years ago and it would not have been possible.
Heh, another guy who never heard of the Amiga.
You're right, of course. 4 years ago, you couldn't buy a home computer to do that. But 8 years ago you could! :)