Console Makers Scaling Back Their Push For HD

The big news about game consoles of late has been the recent price drops and hardware changes. However, an editorial at GamesIndustry looks into one of the side effects of those updates: decisions by both Microsoft and Sony not to include HDMI cables with their HD-capable consoles, despite the companies' long-standing interest in high-definition gaming. "From the perspectives of these companies, they want to include the cable which will be of most utility to the largest group of consumers possible, and it's clear that whatever research they have done suggests that the majority of consumers don't need — or rather, can't use — an HDMI cable. Neither firm wants to put an assortment of cables in the box 'just in case' — each additional cable erodes millions from the firm's profitability, after all. ... Supporting evidence that all is not well with the HD transition comes from Epic Games' Mark Rein, who told Eurogamer earlier this summer that 'over half the users who played Gears of War 2 so far do not have HDTVs.' Gears of War is a core gamer franchise, beloved of early adopters and the [so-called] hardcore. If less than half of those users are playing on HDTVs, what must the percentages be like for games like FIFA and Pro Evolution Soccer — let alone Singstar and Buzz, or popular movie tie-in titles?"
  • by Shivani1141 ( 996696 ) on Sunday September 06, 2009 @03:52AM (#29329667)
    As an air-conditioning technician, I work in people's homes, typically six or more a day. From my own admittedly anecdotal experience, the percentage of my customers who have an HDTV set in the living room is quite close to 100. That said, the "hardcore" "core gamer" market is often teenaged males who happen to have the family's old set in their bedroom with the console connected to it. I'd argue that the percentage of casual gamers who play on an HDTV is higher than that of the "hardcore" gamers.
  • HDTV input lag (Score:5, Informative)

    by Grieviant ( 1598761 ) * on Sunday September 06, 2009 @04:37AM (#29329863)

    The hardcore gaming crowd is well aware that many HDTVs exhibit a significant amount of input lag (delay caused by processing and buffering of the video signal in the TV). It's the type of thing a casual gamer might not notice until they play on a different TV, because you tend to adjust to whatever you're playing on. Most TVs and monitors don't even publish it among their main specs, even though it usually dwarfs response time. It can have a serious effect on gameplay, particularly in fast-paced FPS games (though Gears is rather slow-paced). I didn't really notice the difference until I started playing on a smaller monitor instead of my larger HDTV.

    CRTs are still the best choice for minimizing input lag, but most LCD monitors are decent as well. I'm not sure whether that's mainly due to their smaller size or because they're designed for quick response to mouse movement (whereas TVs are designed for viewing, so a few tens of milliseconds of extra lag is of no consequence). A rough sense of what those milliseconds mean in whole frames is sketched below the link.

    http://en.wikipedia.org/wiki/Input_lag [wikipedia.org]
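
    To put that in perspective, here's a trivial back-of-the-envelope conversion from milliseconds of lag to whole frames (a quick sketch in Python; the lag figures are made-up illustrative values, not measurements of any particular set):

        # Convert display input lag (in ms) into frames of delay at a given
        # frame rate. The lag values below are illustrative, not measured.

        def frames_of_lag(lag_ms, fps=60.0):
            frame_time_ms = 1000.0 / fps
            return lag_ms / frame_time_ms

        for label, lag_ms in [("low-lag monitor", 10),
                              ("HDTV in game mode", 30),
                              ("HDTV with full processing", 80)]:
            print(f"{label}: {lag_ms} ms ~ {frames_of_lag(lag_ms):.1f} frames at 60 fps")

    Even 30 ms is nearly two full frames at 60 fps, which is why the figure matters more than the headline response-time spec.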

  • Re:HDTV input lag (Score:5, Informative)

    by Microlith ( 54737 ) on Sunday September 06, 2009 @05:07AM (#29329949)

    It's why most LCD TVs have special modes for game consoles. On my Sharp Aquos it's called "Vyper Drive" but all it does is turn off any processing and scaling, the result being no lag.

    I have this setting active for several analog inputs as well as the input from my PC, and I've had no issues at all with games of any kind on my HDTV.

  • by EnglishTim ( 9662 ) on Sunday September 06, 2009 @05:11AM (#29329961)

    Was anyone really gaming at 1600x1200 back then? The top-end card of the time was the nVidia RIVA TNT2. The high-end units had 32MB of RAM, but most had 16. To get a 32-bit double-buffered framebuffer and a 16-bit Z-buffer you'd need about 18MB of memory; with a 16-bit framebuffer you'd need about 11MB (the rough math is sketched at the end of this comment). Although the hardware would have had just enough memory bandwidth to do 30fps at that resolution, I doubt you'd have hit it with most games. Also, around that time most people were still stuck on 15" monitors, with 17" considered high end and the occasional crazy bastard sporting a 19" monitor. 1600x1200 wasn't really a sensible resolution unless you had a 19" monitor or larger.

    No, back then I remember most people were still gaming at 640x480 or 800x600, with the higher end at 1024x768. (Of course, back then the majority of console gamers were still at 320x240 or something similar...)

    In fact, back then we were still making sure that our games ran on the original Voodoo 1 cards (2MB framebuffer, 2MB texture memory) - partly because there were still a significant number of people with that class of card, and partly because we all still loved the Voodoo 1 dearly for having been the first really good 3D card...
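
    The buffer math works out like this (a quick sketch assuming plain double buffering plus a single Z-buffer, ignoring textures and any padding or alignment):

        # Back-of-the-envelope VRAM needed just for the colour and depth buffers.
        # Assumes double-buffered colour plus one Z-buffer; ignores textures,
        # padding and any auxiliary buffers the driver might allocate.

        def buffer_mb(width, height, colour_bits, z_bits, colour_buffers=2):
            bytes_per_pixel = (colour_buffers * colour_bits + z_bits) / 8
            return width * height * bytes_per_pixel / (1024 * 1024)

        print(f"{buffer_mb(1600, 1200, 32, 16):.1f} MB")  # ~18.3 MB: 32-bit double buffered + 16-bit Z
        print(f"{buffer_mb(1600, 1200, 16, 16):.1f} MB")  # ~11.0 MB: 16-bit double buffered + 16-bit Z

    On a 16MB TNT2, the 32-bit case simply doesn't fit, and the 16-bit case leaves almost nothing for textures.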

  • by CronoCloud ( 590650 ) <cronocloudauron.gmail@com> on Sunday September 06, 2009 @05:31AM (#29330015)

    Gaming devices benefit more than anything else from higher quality inputs/connections and displays. Trust me on this.

    Let's go back to the Atari 2600 days. The machine shipped with a TV/game switch that used 300 ohm spade-type connectors: you'd unscrew the terminals on the back of your TV and screw them back down over the spade lugs. This gave you a picture easily affected by loose connections and interference (herringbone patterns and the like).

    Now, you could go to Radio Shack and buy a TV/game switch with a standard 75 ohm coaxial connector. Use that instead of the 300 ohm one and the herringbone went away, everything looked better, and it was easier to hook up.

    Nintendo's NES came with an automatic game switch with a 75 ohm coaxial connection. If you still needed 300 ohm, you had to buy a separate adapter that IIRC was called a balun. The NES also had composite connectors, which gave much better output, if you were lucky enough to have a set or monitor (like one of those Philips-made Commodore monitors) that could accept it back in the mid-to-late '80s.

    By 1991, composite inputs on TVs were common enough that Nintendo included a composite cable by default with the SNES; the RF connection was a separate purchase, though they still sold plenty of those since, as I've mentioned in this discussion, many game machines are connected to a cheaper, less technically capable set than the family's main one. The SNES also supports S-Video output, though back in 1991 it was rarer to find a set with it. I remember going to a TV dealer (back when there were such things) and asking which sets supported S-Video for the upcoming SNES. They said none did, and asked why I would need that, since nothing used it.

    SNES games look really nice over S-Video: good color that doesn't bleed, sharp text. The difference really does stick out.

    The PS1 came with composite connections out of the box, and the original version of the machine had actual composite and S-Video connectors on the machine itself, though again, Sony sold a lot of RF connection gizmos. Later models switched to what became the standard PlayStation multi-out jack. The PS1 is also the first game system I owned that had a few games that worked best over S-Video due to font/text issues (Darkstone, I'm looking at you). S-Video made everything look good.

    The PS2 supports component connections, though since TVs with component inputs weren't all that common, the cables themselves weren't ubiquitous. But they were required if you wanted to use certain games' progressive-scan modes, and they also helped color clarity and whatnot in regular games.

  • by Anonymous Coward on Sunday September 06, 2009 @06:33PM (#29334879)

    Console game developer here.

    I think "Fake" is too harsh a word for it. Keep in mind that all Xbox 360 games MUST use some kind of full-screen antialiasing technique, because it is a certification requirement from Microsoft. Nearly all of them use the built-in 4x MSAA to satisfy this. So even with a render target of only 600 lines, you're getting 2400 samples vertically. After post effects and the hardware scaling, most games will not have any noticeable aliasing artifacts ("jagged edges"). That is, after all, the purpose of that certification requirement--to make sure there are no jaggies.

    The console lets you (the user) select the output resolution you want to send to your TV or other display device. (I use 720p myself, because I use a 1024x768 VGA projector to play games on my wall).

    It then lets the game render at whatever resolution it wants, and the excellent built-in hardware upscaler converts the game's framebuffer into something that can be sent to the display. Most games use a fixed resolution for their framebuffer so that they have predictable framerate and performance (you wouldn't want your 30fps shooter to drop to 20fps just because you picked 1080p instead of 720p, would you?). The story on the PS3 is similar. (Rough numbers for the sampling and scaling arithmetic are sketched below.)
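
    As a rough illustration of the numbers involved (the 1024x600 render target below is a hypothetical example of a sub-HD buffer, not a claim about any particular game):

        # Rough arithmetic for a sub-HD render target with 4x MSAA, upscaled by
        # the console's hardware scaler to the user's chosen output mode.
        # The 1024x600 render target is an illustrative example.

        RENDER_W, RENDER_H = 1024, 600   # what the game actually renders
        OUTPUT_W, OUTPUT_H = 1280, 720   # the output mode selected on the console
        MSAA = 4                         # samples per pixel

        samples_per_column = RENDER_H * MSAA          # 600 lines x 4 samples = 2400
        total_samples = RENDER_W * RENDER_H * MSAA    # colour samples before the resolve
        scale_x = OUTPUT_W / RENDER_W
        scale_y = OUTPUT_H / RENDER_H

        print(f"{samples_per_column} samples down each column before the MSAA resolve")
        print(f"{total_samples:,} colour samples in the whole render target")
        print(f"scaler stretches the resolved image {scale_x:.2f}x horizontally, {scale_y:.2f}x vertically")

    The point is that the antialiased, filtered upscale hides most of the difference between the game's internal resolution and the mode you picked in the dashboard.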