PS4: What Sony Should and Shouldn't Do

donniebaseball23 writes "As a follow-up to his piece on Xbox 720, veteran games journalist Chris Morris has put together some thoughtful advice on what Sony needs to do (and needs to avoid) to ensure that the next generation PlayStation is a success. In particular, Morris notes that Sony must 'look beyond games' to create a fully fledged entertainment hub: 'Nintendo has been pretty adamant that it has little interest in content beyond games. Microsoft seems to be rushing to embrace the set top box world. Sony, though, seems a bit confused about what it wants.'"
  • by pak9rabid ( 1011935 ) on Tuesday January 17, 2012 @10:16PM (#38733602)

    Which, quite ironically, is pretty much just a SNES pad with one extra L/R button and a knee-jerk reaction to the N64's analog stick.

    Interestingly enough, the PS controllers look like that because before Sony released the PSX, they were working with Nintendo to create a CD-ROM add-on for the SNES (much like the Sega CD for the Genesis) [wikipedia.org]. Near the end of the project, Nintendo decided to abandon the idea, which infuriated the president of Sony. Not long afterwards, Sony came out with the PlayStation to rival Nintendo.

  • by Brain-Fu ( 1274756 ) on Tuesday January 17, 2012 @10:16PM (#38733606) Homepage Journal

    Sony released audio CDs that installed rootkits on consumers' PCs without informing them. After being sued for this, they did it again. They also failed to do due diligence on security, allowing the private data of their entire customer base to be stolen. Combine this with their habit of selling features and then subsequently removing those very features, and I don't understand why *anybody* buys products from Sony.

    I will never trust Sony again.

  • by mjwx ( 966435 ) on Tuesday January 17, 2012 @10:55PM (#38733914)

    The Wii's sales began to significantly drop several years ago. Last May, sales were down 38% year-over-year and fell to record lows in Japan.

    Nice attempt to frame the argument.

    But the Wii has sold a lot more consoles than Sony and Microsoft have. Also remember that Sony had to significantly redesign its console to stop it haemorrhaging money, and Microsoft has also redesigned its console, as well as shipping various versions (Arcade, Elite).

    Wii, 1 production model: 90 million sold.
    Xbox360, 4 production models: 66 million sold.
    PS3, 2 production models: 55 million sold.

    Telling me that sales of the Wii have dropped is simply saying it isn't selling as phenomenally well as it did at release; the same thing happened with the Xbox360. This indicates it's reached its saturation point, not that the product has failed. A slowdown in sales after 5 years is normal. The PS3, on the other hand, did not see the majority of its sales until after its redesign.

  • by walshy007 ( 906710 ) on Tuesday January 17, 2012 @11:46PM (#38734220)

    The reason they abandoned it was that, per the fine print, all profits from games released on CD media would go to Sony... and profit from games was Nintendo's bread-and-butter income.

  • Re:Disagree. (Score:3, Informative)

    by Zeroedout ( 2036220 ) * on Wednesday January 18, 2012 @01:05AM (#38734614)

    It works fine if you set it up correctly:

    1. Make sure you're not too far away, play with the distances.
    2. If the sensor bar is above your tv or below, make sure the right option is set in the options menu of the Wii OS.
    3. Make sure the sensor bar is dead centre.

    Note that excessively bright lights can interfere, so experiment to find what works for you. I had to move my couch closer to the TV, as its regular position wouldn't give me the accuracy I desire.

  • by JDeane ( 1402533 ) on Wednesday January 18, 2012 @03:33AM (#38735338) Journal

    Ummm no, just no on so many levels....

    The PS3 has one Cell Broadband Engine, which is a single CPU based on the PowerPC architecture, with 8 SPUs on the die; one SPU is disabled at manufacture to increase yields. They run tests on the chip, and if an SPU is bad they isolate and disable it. If all of them are good, they disable one anyway.

    Only 6 SPUs are usable by programs, as the 7th enabled SPU is reserved for the OS's hypervisor. (A sketch of the binning arithmetic follows below.)

    http://en.wikipedia.org/wiki/PS3_Hardware [wikipedia.org]

    (To access Wikipedia just refresh the page when you see the SOPA thing and hit stop before the redirect.)
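
    For the curious, here's a minimal C sketch of that binning logic. Purely illustrative -- not Sony's actual test code -- with the counts taken from the description above:

    /* Hypothetical sketch of the Cell SPU yield-binning described above:
     * 8 SPUs on the die, exactly one fused off at manufacture (a bad one
     * if found, otherwise a good one), and one of the remaining 7
     * reserved for the hypervisor, leaving 6 for games. */
    #include <stdio.h>

    #define SPUS_ON_DIE     8
    #define HYPERVISOR_SPUS 1

    int main(void) {
        int defective = 1; /* SPUs that failed the manufacturing test */

        /* One SPU is always disabled, defective or not. */
        int enabled = SPUS_ON_DIE - (defective > 0 ? defective : 1);

        /* A chip with more than one bad SPU can't ship as a PS3 part. */
        if (enabled < SPUS_ON_DIE - 1) {
            printf("chip rejected: only %d usable SPUs\n", enabled);
            return 1;
        }

        printf("enabled: %d, hypervisor: %d, available to games: %d\n",
               enabled, HYPERVISOR_SPUS, enabled - HYPERVISOR_SPUS);
        return 0;
    }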

  • by Anonymous Coward on Wednesday January 18, 2012 @06:22AM (#38736072)

    Are there any good games for the Wii? I own one but have probably spent less than 4 hours playing on it.

    Not sure if trolling, but let me list a few:
    The Legend of Zelda: Twilight Princess and Skyward Sword
    Metroid Prime Trilogy
    Metroid: Other M
    Super Mario Galaxy 1 and 2
    Zack & Wiki: Quest for Barbaros' Treasure
    Rayman Origins
    WarioWare: Smooth Moves
    All the Mario Parties
    Resident Evil 4
    Donkey Kong Country Returns
    Goldeneye 007
    Kirby's Return to Dream Land
    MadWorld
    Muramasa: The Demon Blade
    New Super Mario Bros. Wii
    No More Heroes 1/2

    And that's only the stuff I could think of off the top of my head.

  • by Miamicanes ( 730264 ) on Wednesday January 18, 2012 @11:51AM (#38738572)

    > The big joke of the last 3 generations is that Nintendo has put together under-performing hardware

    You're overlooking the 400-ton elephant wearing a pink tutu standing over in the corner -- 1080i60. As much as we'd like for it to be true, native 1080p-anything is far from universal. You'd be horrified if you saw the architectural mess inside most mass-market sub-$400 LCD TV controller ASICs -- it makes the parallel-port-semi-SCSI-kludged-to-USB trainwreck that evolved with scanners look downright elegant by comparison.

    The raw panels themselves can do 24p, 30p, and 60p without drama, but the brain-damaged controllers driving them were value-engineered to kludge anything besides 720p60 the same way they always have -- they bob it (i.e., they treat 1920x1080 16:9 interlaced video as 1920x540 16:9 progressive video, then resample it to 1366x768).
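
    In code, "bob" is roughly the following (a hypothetical sketch; real scaler ASICs do this in hardware with proper filtering, but the geometry is the same):

    /* Hypothetical sketch of "bob" deinterlacing as described above:
     * treat one 1920x540 field of a 1080i frame as a progressive image
     * and resample it to the panel's 1366x768 (nearest-neighbor here;
     * real scalers filter). Each field is scaled independently, which
     * is why half the vertical detail is thrown away. */
    #include <stdint.h>

    #define SRC_W 1920
    #define SRC_H 540   /* one field of a 1080i frame */
    #define DST_W 1366
    #define DST_H 768

    void bob_scale(const uint8_t src[SRC_H][SRC_W],
                   uint8_t dst[DST_H][DST_W])
    {
        for (int y = 0; y < DST_H; y++) {
            int sy = y * SRC_H / DST_H;     /* stretch 540 lines to 768 */
            for (int x = 0; x < DST_W; x++) {
                int sx = x * SRC_W / DST_W; /* squeeze 1920 columns to 1366 */
                dst[y][x] = src[sy][sx];
            }
        }
    }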

    When presented with 1080p24, instead of just natively showing it at 24fps, they stupidly apply 3:2 pulldown to emulate 1080i60 and pass it to the same brain-damaged controller. I've seen cheap LCD TVs that somehow managed to produce weave artifacts from 1080p30 source. And today's Walmart crap is the semi-high-end from 5 years ago.
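
    (For reference, 3:2 pulldown just repeats fields in a 2-3 cadence to stretch 24 film frames into 60 fields per second; a minimal sketch:)

    /* Hypothetical sketch of the 3:2 pulldown described above: map 60
     * interlaced fields per second back onto 24 source frames per second.
     * Frames alternately contribute 2 and 3 fields (5 fields per 2 frames,
     * since 60/24 = 5/2), which is where judder and weave artifacts
     * come from. */
    #include <stdio.h>

    int source_frame_for_field(int field)
    {
        int group = field / 5;  /* each group of 5 fields covers 2 frames */
        int pos   = field % 5;  /* position within the cadence: A A B B B */
        return group * 2 + (pos < 2 ? 0 : 1);
    }

    int main(void) {
        for (int f = 0; f < 10; f++)
            printf("field %d <- frame %d\n", f, source_frame_for_field(f));
        return 0;
    }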

    Put another way, it's going to be at least another 10 years before you can confidently put out 1080p60 video and expect butter-smooth, artifact-free rendering on the TVs in most living rooms. With current TVs out in the wild, modes like 1080p24 and 1080p30, let alone 1080p60, are too inconsistently implemented to risk depending on... and true 1080i60 looks like crap on anything besides a 240Hz set that uses oversampling to emulate interlace fade. So we get the least common denominator -- 1080p30 pretending to be 1080i60, which 10-20% of TVs still manage to screw up and butcher.

    Of course, 720p60 works well on just about everything. Unfortunately, 720p60 isn't sexy enough for the marketing department. So instead of getting judder-free, butter-smooth 1280x720 60fps video without glitches, and with enough filtering to be almost indistinguishable from real life, we end up with 1080i60 video that looks like crap.

    That's the sad truth. 720p60 isn't good enough for the marketing department, 1080i60 rendered AS 1080i60 looks like crap on most TVs. 1080p60 is a fantasy in 94% of the homes in America, 1080p24 is badly-implemented in at least a quarter of the TVs out there, and 5-10% somehow manage to even screw up 1080p30 encoded as fake 1080i60.

  • by Miamicanes ( 730264 ) on Wednesday January 18, 2012 @05:06PM (#38741710)

    Oh... just to add... in case anybody is wondering why interlacing is a NEW problem: it's because old videogames and home computers tricked TVs into scanning EVERY field as if it were an "odd" field, instead of scanning odd fields, then even fields. That's why they had black scanline artifacts between every other row of pixels. In effect, they tricked CRT TVs into a pseudo-progressive 60fps mode by scanning the same field over and over and leaving the even field's scanlines dark, instead of alternating between the odd and even fields (refreshing each 30 times per second). A minimal sketch of the arithmetic behind the trick:
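
    /* Illustrative only; the line counts are from the NTSC spec. A
     * standard NTSC field is 262.5 lines long, so consecutive fields
     * start half a line apart and the beam traces the in-between
     * scanlines (interlace). Old consoles sent 262 whole lines per
     * field, so every field starts at the same vertical phase and
     * repaints the same scanlines, leaving the other set dark. */
    #include <math.h>
    #include <stdio.h>

    int main(void) {
        const double NTSC_FIELD = 262.5; /* lines per field, NTSC spec */
        const double GAME_FIELD = 262.0; /* what 240p-era consoles sent */

        for (int field = 0; field < 4; field++) {
            printf("field %d: NTSC offset %.1f line, console offset %.1f line\n",
                   field,
                   fmod(field * NTSC_FIELD, 1.0),
                   fmod(field * GAME_FIELD, 1.0));
        }
        /* NTSC alternates 0.0 / 0.5 (odd/even scanlines); the console
         * signal is always 0.0 -- the same scanlines every field. */
        return 0;
    }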

    There's a bigger problem lurking with "true" HD video -- 50 and 60 fps aren't fast enough to climb out of the "Uncanny Valley". It turns out the white papers written back in the 80s and 90s were biased by the physical behavior of film and CRT displays, and made lots of assumptions that fall apart when you're talking about an inherently progressive display like an LCD, and source video that's essentially rendered to digital film of infinite sensitivity, one frame of infinitely short duration at a time. It turns out motion blur encodes a hell of a lot of extra visual information into each frame, which naive attempts to apply Nyquist to synthetic video content fail to appreciate.

    The PC video card industry and the gaming industry started to become painfully aware of the problem about 10 years ago, and they're still working on it. The current band-aid is to simulate motion blur (essentially temporal supersampling, sketched below)... but motion blur itself becomes visually tedious after a while.
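
    A hypothetical sketch of that band-aid -- render_subframe() is an assumed engine hook, not a real API:

    /* Simulated motion blur via temporal supersampling: render several
     * sub-frames across one frame's exposure window and average them,
     * baking in-between motion into the pixels the way a film exposure
     * would. */
    #include <string.h>

    #define W 1280
    #define H 720
    #define SUBFRAMES 8

    /* Assumed: renders the scene as it appears at time t into rgb. */
    void render_subframe(double t, float rgb[H][W][3]);

    void render_motion_blurred(double frame_start, double exposure,
                               float out[H][W][3])
    {
        static float sub[H][W][3];
        memset(out, 0, sizeof(float) * H * W * 3);

        for (int s = 0; s < SUBFRAMES; s++) {
            /* sample the middle of each sub-interval of the exposure */
            double t = frame_start + exposure * (s + 0.5) / SUBFRAMES;
            render_subframe(t, sub);
            for (int y = 0; y < H; y++)
                for (int x = 0; x < W; x++)
                    for (int c = 0; c < 3; c++)
                        out[y][x][c] += sub[y][x][c] / SUBFRAMES;
        }
    }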

    Between glasses-free 3D and refresh rates fast enough to make motion-blur unnecessary, there's still plenty of room for future advancement in videogame and TV technology. We have a long, long way to go before you'll be able to dress up a monitor like a fake window & feel like you're looking outside at a real scene... made longer by the fact that the advancements needed to take videogames to this level go WAY beyond what mass-market consumers are likely to care about, much less demand, for TV-watching purposes. This means the quantum leap that occurred over the past 10 years is more of a fluke than anything, and isn't likely to be sustainable in the long run.

    At some point, the cost burden is going to shift from mass-market consumers subsidizing the technology through billions of general TV sales back to mere millions of high-end gamers driving the market for outrageously expensive (compared to what you'd pay for even an *expensive* TV at Wal Mart) cutting-edge hardware. Think: the gigantic cost leap seen today when you go from 1920x1080 to any higher res (like 2560x1920), but even worse.
