
Are Console Developers Neglecting Their Standard-Def Players?

The Digital Foundry blog takes a look at how the focus on high-quality graphics in console game development may be lost on more gamers than people realize. According to Mark Rein of Epic Games, more than half of Gears of War 2 users played the game on a standard-definition television. While you might expect that dropping the graphics quality would correspond to a boost in frame rates, that turns out not to be the case, and running at standard definition can actually be a detriment in some cases. Quoting: "PAL50 is mandatory for SD gameplay on all games on all European PS3s. You can't avoid running at a sub-optimal 50Hz unless you splash out on a high-def screen. The Euro release of Killzone 2 works at SD resolution on any PS3, even if it can only run at PAL50 on a Euro machine. In short, if you're a Euro PS3 owner playing Killzone 2 on a standard-definition display, you're losing around 17 per cent of the frame-rate owing to the lack of PAL60 support in the PS3 hardware. The game itself isn't slower as such (as was often the case in the Mega Drive/SNES era), and you'll note that it's effectively a sustained 25FPS while the 60Hz versions can be somewhat more variable. But Killzone 2 is already somewhat laggy in its control system and this impacts the feel of the game still further. While there is a 17 per cent increase in resolution, this is far less noticeable than the additional numbness in the controls."
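The "around 17 per cent" figure is just the ratio of the two field rates; a quick sanity check in Python, using only the numbers from the quote:

    # PAL50 sustains 25FPS; a 60Hz machine targets 30FPS (half the field rate)
    pal50_fps, pal60_fps = 25, 30
    loss = 1 - pal50_fps / pal60_fps
    print(f"{loss:.1%}")  # 16.7% -- the quoted "around 17 per cent"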
  • Not that I can see (Score:1, Interesting)

    by AuMatar ( 183847 )

    But I have a Wii. One of the reasons I don't get an HD console is early reports of games that ran poorly in standard def. The other reasons are the price (less of an issue now) and the lack of any interesting games (no, FPSes aren't and never have been interesting).

    • by Rayonic ( 462789 ) on Sunday July 26, 2009 @03:35AM (#28825303) Homepage Journal

      and the lack of any interesting games (no, FPSes aren't and never have been interesting).

      Translation: I think all first-person games are the same! Call of Duty is the same as Fallout 3! Aren't I smart and sophisticated?

      Answer: No.

      • by Kokuyo ( 549451 )

        So he can't just not like FPS games (which must all have SOMETHING in common to be categorized in the same group) without you calling him names?

        You're an ass.

  • by jx100 ( 453615 ) on Sunday July 26, 2009 @01:39AM (#28824861)

    I'm currently playing MGS4 on an SDTV and... good god, everything's tiny. It's nearly impossible to read half the material on the screen!

    • Mod parent up!

      I don't need fancy graphics and awesome special effects. They're nice, but I don't need them. I DO need readable text-- for everything. Non-voiced dialog, menu options, etc.

      I don't have the money to buy an HDTV, nor the desire to bring one into a house so likely to see a wiimote thrown straight through it. So, I'm stuck squinting and getting real close to the TV to try and figure out what menu option I'm about to select.

  • by DRBivens ( 148931 ) on Sunday July 26, 2009 @01:44AM (#28824887) Journal

    Understandable, perhaps, by thinking about the mindset of developers in most game companies' labs. Who really wants to be the poor sod with the low-def development gear at his/her desk?

    Any self-respecting geek (myself included!) would rather chew glass than suffer the agony and stigma of working on old gear...

    • by GF678 ( 1453005 )

      Any self-respecting geek (myself included!) would rather chew glass than suffer the agony and stigma of working on old gear...

      Any self-respecting geek should find enjoyment in seeing what their limited hardware can do, as opposed to having hardware with huge amounts of power which gets wasted.

      In limiting your options, you can gain an appreciation for how to optimize, cull the fat/bloat, and so on. I like old gear because there's a skill in knowing how to utilise it well instead of throwing money away on newer hardware.

      • Re: (Score:2, Interesting)

        I think the parent is talking about those "modern geeks", those superficial, MacBook-wielding, super star programmers. You know, the cool kind of geek, who know all about the shiny tech the jocks and their girlfriends like to play with. They're real popular nowadays, too, because they can fix your Vista notebook or set up your HD home cinema or even write a witty reply on craigslist for you.

        But don't tell them anything about tinkering with old tech. Why would you play Quake on a TRS-80 when you can play Gears of War instead?

  • User interface size (Score:5, Interesting)

    by Jared555 ( 874152 ) on Sunday July 26, 2009 @01:48AM (#28824911)

    This is compounded by the fact that a lot of games don't seem to scale up the user interface very well when running at standard definition. Combining SD with a small UI is bad enough; once you reduce the TV size below 27" things get even worse. (Even on the PS2, many games had text small enough that on a lot of TVs, especially smaller ones, the letters were solid blocks even if you were looking at the screen from 2" away.)

    • Agreed - I recently got a chance to try out a 360 on a standard-def screen. I don't think any of the ~5 games I tried had a UI that worked well on it. If the text wasn't downright unreadable, it required an extra second to process because of how little detail it had. If over half the Gears 2 players are on SD, you'd think it would make a little sense to have it looking good for them.
    • by Z00L00K ( 682162 )

      Maybe the real answer is that game consoles aren't really that fancy. If you want extreme graphics you need something better than what a TV can offer anyway.

      Don't forget that PAL's 50Hz rate was chosen to avoid interference with the 50Hz power grid. NTSC runs at 60Hz but has lower vertical resolution than PAL, so there is nothing to gain by selecting NTSC instead.

      Anyway - almost every TV set sold today is able to produce a better picture than classic PAL or NTSC can deliver.

    • This is compounded by the fact that a lot of games don't seem to scale up the user interface very well when running at standard definition. Combining SD with a small UI is bad enough; once you reduce the TV size below 27" things get even worse.

      What really irks me is that every UI in console games now has to be in HUGE TEXT just so that it will be readable on an SD screen. Why must HD users suffer because people need to be able to play at low resolutions?

      To make matters worse, a lot of PC games are now ported from consoles and inherit the same oversized UI.
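      A resolution-aware UI would answer both complaints at once: derive text size from the output mode instead of hard-coding it. A minimal sketch in Python; the function name and the clamp values are made up for illustration, not any console SDK's actual API:

          # Hypothetical helper: scale UI text with output resolution, but clamp
          # to a floor so SD players still get legible text.
          def ui_font_px(screen_height_px: int, base_px: int = 24) -> int:
              scaled = round(base_px * screen_height_px / 1080)  # design at 1080 lines
              return max(scaled, 16)                             # readability floor

          print(ui_font_px(1080))  # 24 on a 1080p set
          print(ui_font_px(480))   # 16 on a 480-line SDTV, not an illegible 11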

  • by TheRealRainFall ( 1464687 ) on Sunday July 26, 2009 @01:58AM (#28824941)
    A $500 TV. You can get a 42" LCD 1080p for 499 at Costco last I checked. I mean, why would you have a PS3 and NOT have a 1080p TV? Seems pretty silly. Sell the PS3 and buy a TV and then buy a PS3 again when you can afford it. It's well worth it for watching all the sports in beautiful HD.
    • Re: (Score:3, Insightful)

      by Anonymous Coward

      Well, because that's ANOTHER $500 on top of the price of the console for an increase in resolution. I wouldn't pay $500 to run my PC games at, say, 1600 x 1200 vs 1024 x 768 with AA on.

      Also, I think high-def television is one of the biggest rip-offs I have ever come across. Now that the BBC and Channel 4 are putting things online, it's much more convenient just to watch from the PC anyway. Fuck TV; it isn't worth half a grand for an increase in resolution.

      • Standard def is 640x480 interlaced, vs hi def's 1920x1080 progressive. That's 13.5 times the pixels, or a 1350% increase.
        • by julesh ( 229690 )

          Standard def is 640x480 interlaced

          Where I am, standard def is approximately 720x576. Widescreen sets may support 1024x576. But even using your figures, my calculations say it's only a factor of 6.75 increase. Interlacing reduces the frame rate, not the display resolution.
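          Easy to verify in Python; raw pixel counts per frame only, since (as above) interlacing halves the temporal rate, not the spatial resolution:

              hd = 1920 * 1080
              for w, h in [(640, 480), (720, 576)]:
                  print(f"{w}x{h}: factor {hd / (w * h):.2f}")
              # 640x480: factor 6.75
              # 720x576: factor 5.00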

          • by throup ( 325558 )

            SD widescreen sets are still 720x576. All SD sets will then stretch the image to get the correct aspect ratio. A 4:3 set stretches 720x576 => 768x576; whilst a widescreen set stretches 720x576 => 1024x576.
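            Those two stretch targets are exactly the widths that give 576 lines square pixels at each aspect ratio:

                print(576 * 4 // 3)   # 768  for a 4:3 set
                print(576 * 16 // 9)  # 1024 for a 16:9 set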

            • Re: (Score:3, Informative)

              by FireFury03 ( 653718 )

              SD widescreen sets are still 720x576. All SD sets will then stretch the image to get the correct aspect ratio. A 4:3 set stretches 720x576 => 768x576; whilst a widescreen set stretches 720x576 => 1024x576.

              Well, not quite. There is no hard limit to the horizontal resolution of an analogue TV - the horizontal dimension isn't divided into pixels, it is simply a continuously varying signal. If you're driving the TV off RF or composite then the horizontal resolution is restricted by the modulation (high horizontal frequencies will bleed into the chroma carrier, so the modulators usually need to filter them out). S-Video, RGB and component shouldn't be affected by these limits, so you can drive your TV at higher horizontal resolutions.

          • OMG, fail. Do you know WHY it reduces the frame rate?! Because it's refreshing two half-height fields INTERLACED to make them look like a single larger, higher-res image. This is why 1080i is not as smooth to watch as 1080p: the actual number of pixels making up each source image is almost half as much.
          • One could argue that by cutting your framerate in half, your TV is only pumping out half the amount of image data that it otherwise would at a given resolution, effectively halving the "usable image."

          • 1) 720x576 may be the spec for an analogue PAL TV, but that doesn't mean that's what a console renders at for its standard def. It wouldn't surprise me at all if it did 640x480, as that is very common in the PC world and these consoles use PC-based graphics chips. Remember that older consoles, like the SNES, worked at extremely low resolutions like 256x224. You don't have to target the spec resolution.

            2) An analogue TV isn't going to resolve its spec resolution anyhow. Unless you've got an extremely expensive studio monitor, the set itself won't actually resolve that much detail.

            • Well, now we don't have to deal with that difference. LCD HDTVs are made with the same technology as your LCD computer screen, and their interconnects use the same signaling. With an HDTV, you can have all the same resolution you get on a computer, and new consoles are made to take advantage of it.
              Though they still suck as monitors, because they always seem to blur the image noticeably, even with filtering settings at minimum and a signal in their native resolution.

            • by Khyber ( 864651 )

              "desktop computers didn't use regular PAL or NTSC screens"

              Texas Instruments 99/4A - first computer I ever used/owned, hooked directly to a TV.

          • Interlacing by definition means drawing every other scanline in each field.
      • Re: (Score:1, Insightful)

        by timmarhy ( 659436 )
        Then you're an idiot for buying a PS3 in the first place, since its whole selling point is high def. Stick to your PS2 and SD.

        And the difference is more than 1600x1200 vs 1024x768; it's 1920x1080 vs 320x240. So unless you have rocks in your head, it's a hell of a big step up.

        • Re: (Score:1, Insightful)

          by AuMatar ( 183847 )

          No, the selling point of the PS3 is new games and a more powerful console. HD wasn't a selling point to everyone. Your point would only be valid if every game for the PS3 also came out on the PS2.

          As for a big step up - eh. I can barely tell a difference on TV shows. I haven't done a test on video games - the difference there may be more pronounced - but I (and many others) quite frankly don't care.

          • If you can "barely tell a difference on TV shows", then you're most likely using coaxial for your HD TV, in which case, no, you wouldn't notice much of a difference because coaxial limits you to 480i. However, using component will let you get up to 1080i and HDMI goes all the way up to 1080p. If you are using HDMI and running HD shows at 1080p and claiming you can barely notice a difference, then you need to get to the doctor pronto to find out wtf is wrong with your eyes.
            • I think you're confusing coaxial connections for watching television shows, which can be HD, vs coaxial connections for playing games, which aren't.

              • The person I was replying to specifically mentioned not being able to tell the difference between SD and HD tv shows. =)
          • by Khyber ( 864651 )

            Actually, one of the selling points of the PS3 was the ability to play my PS2 games (I still have a first-gen PS2 that won't play a few brand-new PS2 games), but Sony fucked that up so many different ways that I just forgot about it and instead bought the PS3 because it had more standards compliance than the 360.

        • by julesh ( 229690 )

          And the difference is more than 1600x1200 vs 1024x768; it's 1920x1080 vs 320x240

          Nothing uses 320x240. Seriously. Standard definition of a PAL TV is (approximately) 720x576.

          • Try deinterlacing that so we are comparing apples to apples. Oh look, it's pretty close to 320x240.
            • Actually the effects are more complicated.

              Interlacing means that each line on the screen is only drawn 25x per second. It does not mean you have only half as many lines. The result is that thin horizontal lines tend to flicker, but it does not diminish the resolution.

              What DOES diminish the resolution is the rather coarse dot pitch and low video bandwidth of old TVs, especially if you connect through the antenna port. So you get an image that is maybe as sharp as 640x480 standard VGA on a decent computer monitor.

              • by tepples ( 727027 )

                The result is that thin horizontal lines tend to flicker, but it does not diminish the resolution.

                Super Smash Bros. Melee for GameCube and Super Smash Bros. Brawl for Wii have an option to blur the screen vertically (using a [1 2 1]/4 filter IIRC), precisely to get rid of flicker. So yes, there is a tradeoff between flicker-free video and resolution in SDTV.
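                That [1 2 1]/4 filter is a one-line vertical convolution. A sketch of the idea in Python/NumPy - the function name and the wrap-around edge handling are mine, not the games' actual implementation:

                    import numpy as np

                    def deflicker(frame: np.ndarray) -> np.ndarray:
                        up = np.roll(frame, 1, axis=0)     # line above (edges wrap, for brevity)
                        down = np.roll(frame, -1, axis=0)  # line below
                        # Adjacent scanlines bleed together, so single-line detail
                        # can no longer strobe at the field rate - at the cost of
                        # vertical sharpness.
                        return (up + 2 * frame + down) / 4

                    smooth = deflicker(np.random.rand(480, 640))  # one SD-sized luma frame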

          • Nothing uses 320x240.

            NTSC pretty much does, per field.

            Seriously. Standard definition of a PAL TV is (approximately) 720x576.

            Composite video signals are made of two frequency bands: luma and chroma. Chroma in PAL occupies the band 4.43 MHz +/- 0.6 MHz. Therefore, luma has to fit in below 3.83 MHz in order not to cause ugly artifacts. Nyquist's theorem states that such a signal can be perfectly reconstructed from samples at a rate of twice the highest frequency, in this case 7.66 MHz. Each scanline is 52 microseconds; 52 * 7.66 = 398 samples, which is very close to the 384x288 that would give square pixels.
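            Running the poster's numbers (all figures from the post above):

                chroma_carrier, chroma_bw = 4.43e6, 0.6e6  # PAL chroma band, Hz
                luma_limit = chroma_carrier - chroma_bw    # 3.83 MHz of usable luma
                sample_rate = 2 * luma_limit               # Nyquist rate: 7.66 MHz
                active_line = 52e-6                        # visible scanline, seconds
                print(round(active_line * sample_rate))    # ~398 luma samples per line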

    • by CSMatt ( 1175471 )

      Because most people have already spent all their money on the console?

    • Sell the PS3 and buy a TV and then buy a PS3 again when you can afford it. It's well worth it for watching all the sports in beautiful HD.

      So, 500 for the TV, then more than the same for Sky HD so you actually have anything to watch on it, then 300 for the PS3. You're looking at well over a grand.

  • PAL50 isn't new (Score:5, Interesting)

    by YesIAmAScript ( 886271 ) on Sunday July 26, 2009 @02:01AM (#28824947)

    The issues with 25 frames/50 fields per second aren't new with the development of HD. Why is someone trying to relate the two?

    50 fields is a lot; you can certainly play fast-paced games at those framerates quite well.

    And Killzone 2's controls are not "already somewhat laggy". It responds just fine on my HDTV. Who comes up with this stuff? Maybe the author has various laggy upscaling systems enabled on his TV (tweener circuits are near ubiquitous on recent PAL TVs, since 50Hz is noticeably flickery to a lot of people).

    • by julesh ( 229690 )

      The issues with 25 frames/50 fields per second aren't new with the development of HD. Why is someone trying to relate the two?

      Because with the introduction of HDTV we now have standards for both 50fps and 60fps actively deployed in the same area, so the issue is now a user-by-user one, not a country-by-country one. People who know each other and regularly play on each other's hardware are seeing differences now, not just people who travel from country to country.

      50 fields is a lot...

      • You've probably just never been exposed to it. You can tell the difference between a 24fps movie and a 60fps movie; in fact, it is more dramatic than you might imagine. I was really amazed at the difference just from framerate upsampling. Some new DVD software, PowerDVD 9 in my case, can upsample frame rate as well as pixels. Rather than just displaying each frame multiple times (since monitors run faster than 24fps), it actually uses various algorithms to calculate the in-between frames.
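        The crudest form of that is plain blending of neighbouring frames. Real upsamplers use motion-compensated algorithms; this naive Python sketch only shows the shape of the idea:

            import numpy as np

            def tween(a: np.ndarray, b: np.ndarray, t: float) -> np.ndarray:
                # Naive in-between frame -- a stand-in for motion compensation.
                return (1 - t) * a + t * b

            def upsample_2x(frames):
                # Turn 24fps into 48fps by blending one frame between each pair.
                for a, b in zip(frames, frames[1:]):
                    yield a
                    yield tween(a, b, 0.5)
                yield frames[-1]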

      • Because with the introduction of HDTV we now have standards for both 50fps and 60fps actively deployed in the same area, so the issue is now a user-by-user one, not a country-by-country one. People who know each other and regularly play on each other's hardware are seeing differences now, not just people who travel from country to country.

        Sounds like this comes down to people complaining that "this works better on my friend's shiny new hardware - I want the same results without having to upgrade my 10-year-old hardware". Clearly an utterly stupid complaint.

        Certainly agree here. I've never understood why you would want more than this. Films run at 23.976fps, and you don't get many people complaining that the action is jittery. Most of us can't tell.

        Well, I can certainly see the low frame rate of films compared to TV, particularly on panning shots. (No, 50 fields per second is not the same as 25 frames per second - in normal interlaced TV you do not get a single frame which takes 2 fields to display; each field usually comes from a separate moment in time.)

        • by tepples ( 727027 )

          I've long been astounded at gamers complaining that they can only run their game at 190FPS when their friend can get 200FPS - if your display is only running at 72Hz then who cares?

          There are three reasons to demand a higher FPS than your monitor can display.

          • You have less chance of your game dipping from 72 Hz to 36 Hz or lower during complex scenes.
          • You have less chance of your game dipping from 72 Hz to 36 Hz or lower when your visiting friends (who don't necessarily own a console or PC that they can remove from their home) plug in their gamepads and the screen splits.
          • Your game can render the scene twice and average the results so that the motion blurs, which makes things look even more realistic (see the sketch below).
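          That last point is accumulation-style motion blur. A toy Python/NumPy sketch, assuming the renderer can hand frames back as arrays:

              import numpy as np

              def motion_blur(subframes):
                  # Average temporally adjacent renders into one displayed frame,
                  # e.g. render 144fps, display 72 blurred frames per second.
                  return np.mean(np.stack(list(subframes)), axis=0)

              blurred = motion_blur(np.random.rand(480, 640) for _ in range(2))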
      • Certainly agree here. I've never understood why you would want more than this. Films run at 23.976fps, and you don't get many people complaining that the action is jittery. Most of us can't tell. I fail to see, therefore, why games should need to run at any higher frame rate -- except for issues of poor design where stuff is only calculated once per frame that may need to actually be calculated more frequently than that.

        Actually, films run at 24 frames per second. 23.976 is what you get when you try and squeeze 24fps film into NTSC's 1000/1001 timing (24 * 1000/1001 ≈ 23.976).

      • It only runs 23.976 when converted to TV.

        And film IS noticeably juddery. We'd be much better served with a 48fps film standard.

    • by moon3 ( 1530265 )
      PAL50 is evil; it feels choppy. For many people there is a night-and-day difference between PAL50 and PAL60.
    • The original poster also seems blissfully unaware that PAL60 has fewer lines than PAL50. The PAL60 standard was created for things like DVD players that are playing Region 1 DVDs and imported VHS tapes. A DVD is encoded as a digital frame with the resolution and frame rate chosen to match either NTSC or PAL. The situation for VHS is similar; the tapes are not encoded in NTSC or PAL, they are encoded in a format that can be decoded into analogue frames matching either PAL or NTSC resolution and frame rate.

    • by antin ( 185674 )

      Killzone's controls were laggy, but recent patches have somewhat fixed it. The developer introduced a new 'High Precision' option, which they now enable by default:

      When this option is switched on, it makes the analog sticks more responsive to small movements. Turned off, the controls behave exactly as they did before the patch.

      http://blog.us.playstation.com/2009/05/27/killzone-2-patch-127-details/ [playstation.com]

      I bought the game on release day and I found it a little frustrating at first - the controls did feel slow.

  • by Anonymous Coward

    Seriously, we're making console gaming and just watching TV much more difficult than they should be. Standards are important for a reason: they're a basic consumer protection, because no one has the time and money to support all these different formats, and most consumers just want things to work at an acceptable quality. That's always been console gaming's strength: simplicity.

    This is why, when choosing an HDTV, my roommates and I didn't mess around with 720p or 1080i. I don't care about image quality/money...

  • Wii (Score:3, Interesting)

    by arazor ( 55656 ) on Sunday July 26, 2009 @02:50AM (#28825135)

    I got rid of my Wii for the opposite reason. It looks like crap on an HD set.

    • Re: (Score:3, Insightful)

      by Ant P. ( 974313 )

      Or is it your HD set's scaler that's crap? Never had a problem with image quality on mine...

    • Were you using component cables with the video set to 480p? With component cables and 480p it looks pretty good to me. Not like my PS3, but pretty good. Metroid, Smash Bros., and Punch-Out!! look good for only 2x the hardware of a GameCube.
      • by arazor ( 55656 )

        Yeah, the original Nintendo component cable. I actually bought the component cable first, because the Wii itself was scarce back then.

    • I got rid of my Wii for the opposite reason. It looks like crap on an HD set.

      Not really; with a component cable it doesn't look that good, but it is far from crappy. The Virtual Console looks really crappy, though, partially because Nintendo does not seem to be able to add decent scalers!
      Just for comparison, I recently ran Mario 64 on a good N64 emulator on the same TV and the game looked really good; all the emulator did differently from the Wii was use an HDMI cable, blow up the resolution, and add anisotropic filtering. On the other hand, no scaler helps in...

    • I got rid of my Wii because I like games with some depth, not just waggling a controller around like an idiot.
      Not everyone is a Gen Y hipster having parties at their house every night!
      That console is mostly useless for single-player gamers (oh, and it looked like crap on my HDTV).

  • PAL60? (Score:3, Insightful)

    by julesh ( 229690 ) on Sunday July 26, 2009 @03:11AM (#28825223)

    Why would you expect the PS3 to use some half-assed pseudo-standard that not all TVs can actually display? PAL60 is a perversion of the standard that just happens to work on some TVs because the difference between 50Hz and 60Hz is within the tolerance of their hardware. You can't rely on it to work, and even when it does, the results might not be what you want.

    Example: my last TV could display PAL60 signals, but the picture ended up squashed in the top 3/4 of the height of the display, its aspect ratio completely distorted and practically unwatchable. If I bought a PS3 and it displayed games like this, I'd return it.

    • Re: (Score:3, Informative)

      by grumbel ( 592662 )

      Why would you expect the PS3 to use some half-assed pseudo-standard that not all TVs can actually display?

      PAL60 has been a standard feature of a lot of games for almost a decade; quite a few even have it as a mandatory requirement (Metroid Prime 2: Echoes on GameCube, lots of stuff on Xbox 360). It's only natural that people expect their new console to have features their old ones already had.

    • I work in TV/film and thought I had worked with every crazy framerate/pulldown/interlacing scheme on earth, and I had never even heard of PAL60 before (then again, I'm from the US, so I don't work with anything PAL very often). I had to go look it up on Wikipedia, thinking perhaps I had missed some big new thing in Europe. I still don't know what it is or why it exists. So it's PAL color with NTSC fields? Who uses that? Who supports that? Why would Killzone offer it?

      • Re: (Score:3, Informative)

        by Spad ( 470073 )

        Put simply, most games shipped in Europe in the pre-HD days were done with chunky black borders at the top and bottom of the screen to get the same number of lines as NTSC and thus avoid the slowdown issue normally associated with moving from NTSC to PAL. PAL60 is a fudge that allows them to use the whole of the screen without any slowdown, and in general it works pretty well if your TV supports it.
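        The size of those borders falls straight out of the line-count mismatch:

            # An NTSC-sized (480-line) picture centred on a PAL (576-line) raster
            pal_lines, ntsc_lines = 576, 480
            print((pal_lines - ntsc_lines) // 2)  # 48 blank lines top and bottom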

      • Re: (Score:2, Informative)

        by supertusse ( 1237022 )

        Also useful for playing NTSC material on PAL screens to keep the original framerate and avoid having to resample the audio (or change the speed).

    • Re: (Score:3, Insightful)

      by CronoCloud ( 590650 )

      Perhaps you should convince your governments to switch to the NTSC/ATSC standards used in the countries that develop and manufacture the majority of the games. Unless you do so, you PAL folks are always going to be second-class citizens, video-game-wise.

      • Who cares about video games - we can watch movies without 3:2 pulldown artifacts and in higher resolution even with SD (except for silly people importing Region 1). And we can watch the three LOTR extended editions back to back in about half an hour (28 mins) less than folks in 60 Hz land.
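        The arithmetic checks out if you plug in the commonly quoted extended-edition runtimes (my figures, not the poster's):

            total_min = 208 + 223 + 251          # approx. EE runtimes, minutes
            saved = total_min * (1 - 24 / 25)    # PAL plays 24fps film 4% fast
            print(round(saved))                  # ~27 minutes -- "about 28 mins"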

    • PAL60 is a perversion of the standard

      So is the 288p mode that 8-bit and 16-bit home computers and every game console up through PS1 and N64 used.

      And even if you stick with interlaced signals like those of the Dreamcast, PS2, and newer consoles, there are brazilians of people who would disagree with you. Brazil uses PAL-M [wikipedia.org], which is PAL60 where even the color subcarrier has been moved down to NTSC frequencies.

      • by keeboo ( 724305 )

        And even if you stick with interlaced signals like those of the Dreamcast, PS2, and newer consoles, there are brazilians of people who would disagree with you. Brazil uses PAL-M [wikipedia.org], which is PAL60 where even the color subcarrier has been moved down to NTSC frequencies.

        PAL60 != PAL-M, though both operate at 60Hz.
        PAL-M is 60Hz by design; older PAL-M TVs do not even support 50Hz.
        PAL60 is a non-standard hack of the other 99% of PAL standards (G, B, etc.), which normally operate at 50Hz. Nobody says "PAL60" when referring to PAL-M.

  • By far the largest issue I have playing games in SD is fonts. Lots of games use fonts which are so small that they are barely legible in SD. The worst offender by far is GTA IV where I can't even read the messages that pop up on the phone.

    I realise HD is the future but if SD support is mandatory (and it is), the frigging game should be playable in SD.

    • Fonts could be a problem even back in the days of the PSone. I own a PSone game, Darkstone, where the fonts are very hard to read if you use composite instead of S-Video. Of course, lots of folks said the game was too annoying to play without the PS2's fast-loading feature. The two PS2 Hot Shots Golf games have similar issues with some of the smaller typefaces used in the display.

  • by Tridus ( 79566 ) on Sunday July 26, 2009 @04:56AM (#28825611) Homepage

    This bugs me less than the games that, by the look of it, nobody even bothered to test in SD.

    Dead Rising is the most famous example. The text is UNREADABLE in SD. It was pretty fun getting that demo and then not having any idea what the hell to do, because they just threw a wall of blurriness at you. Lost Odyssey's character status icons were similarly illegible (but the other parts of the game were okay), and I've seen the same lack of attention from lots of other games.

    It's pretty silly. They didn't put "does not function correctly without HD" on the box, so I expect the game to at least work on SD. We've since upgraded to HD and things work fine, but it caused more than one game purchase to not happen.

  • I've done development for the Xbox 360, and one of the submission requirements is to have the game properly tested on SD hardware. You can fail a submission if this has not been done, and in that case MS do not release the game. It surprises me that there appears to be no such requirement for PS3 games...

  • by Waccoon ( 1186667 ) on Sunday July 26, 2009 @06:17AM (#28825887)

    HD is simply higher resolution, and even budget PC hardware has been able to do HD-comparable resolutions for years. I wish people would stop making excuses that going to HD can result in framerate and responsiveness problems when the real issue is that developers are simply throwing in too many polygons, too many pixel shader effects, using memory for textures instead of the frame buffer, and basically making their 3D engines too inflexible. Oh yeah, and the fonts are too small. Force these people to use an SDTV over a composite cable once in a while, please.

    What next? Benchmark pissing wars? I've already had my fill of PC enthusiasts gloating over 140 FPS with their $600 video cards, completely oblivious to the fact that if the video isn't synced with the 60Hz LCD display, the graphics are actually going to look worse. Consoles are already showing PC-like issues like frame tearing and no v-sync. Haven't we already fixed these problems in the PC industry?
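    Tear-free output is mostly a pacing problem. A toy sketch of a vsync-locked loop in Python; a real engine blocks on the GPU's buffer swap rather than a timer, but the principle is the same:

        import time

        REFRESH_HZ = 60
        FRAME_BUDGET = 1.0 / REFRESH_HZ

        def game_loop(render, frames=600):
            next_vsync = time.perf_counter()
            for _ in range(frames):
                render()
                next_vsync += FRAME_BUDGET
                # Wait for the next refresh instead of racing ahead at 140FPS;
                # the buffer swap then lands between scanouts, so no tearing.
                delay = next_vsync - time.perf_counter()
                if delay > 0:
                    time.sleep(delay)

        game_loop(lambda: None)  # stand-in renderer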

  • <rant>
    Of course HD doesn't suck in and of itself, but for people with SD sets it does. Most people with SD sets were PERFECTLY HAPPY with an SD picture. It's just that now everyone's throwing HD pictures at SD sets and it turns out looking like crap, either letterboxed (somewhat acceptable) or fullscreened (oy).

    Just ask my relatives that I helped set up with digital converter boxes. Crappy digital pictures and a bunch of crap channels they never wanted in the first place. Oh joy!

    </rant>
  • by wampus ( 1932 ) on Sunday July 26, 2009 @10:18AM (#28827157)

    Are PC developers neglecting EGA users?
