Next-Gen Console Wars Will Soon Begin In Earnest 284

When the Wii U was released at the end of last year, Nintendo got a head start on the long-awaited new generation of video game consoles. Now, Sony has announced a press conference for February 20th that is expected to unveil the PlayStation 4, codenamed 'Orbis.' This will precede the announcement of the Xbox 360's successor, codenamed 'Durango,' but that too will likely be announced by E3 in June. Specs for development kits of both systems have leaked widely. The two systems both use 8-core AMD chips clocked around 1.6 GHz. Durango has 8GB of DDR3 RAM, while Orbis has 4GB of GDDR5 RAM, though Sony is trying to push that up to 8GB for the console's final spec. Reports also suggest Sony is tinkering with its controller design, going so far as to add a "Share" button to let people exchange screenshots and recordings. Developers indicate the systems are very close in power, though Sony's system currently has an edge. With the upcoming announcement of the PS4, the big-three console makers will kick off a new round of direct competition. They'll maneuver to one-up each other with the most powerful hardware and the slickest software. However, they'll also hope the release of three major consoles in rapid succession will help to anchor a part of the games industry that no longer enjoys the dominance it once did, thanks to competition from mobile gaming.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by The Living Fractal ( 162153 ) <banantarr@hotmail . c om> on Saturday February 02, 2013 @01:42PM (#42771969) Homepage
    Nintendo is not really about technology as much as innovative, fun games for the whole family.
  • by alen ( 225700 ) on Saturday February 02, 2013 @01:50PM (#42772019)

    blu-ray is not a sony tech, the consortium is over 300 companies including apple and microsoft. i don't even think sony owns most of the patents. they were just an early producer of the technology and wanted to push HD for their own profits

  • by Anonymous Coward on Saturday February 02, 2013 @01:58PM (#42772077)

    After all the shit Sony has pulled, who's gonna buy something new from them? Seriously their list of customer abuse is long...

    Time to let Sony die.

  • by FriendlyStatistician ( 2652203 ) on Saturday February 02, 2013 @02:06PM (#42772119)

    Perhaps you've forgotten that Microsoft backed HD-DVD against Blu-Ray.

  • Re:1.6 ghz? (Score:5, Informative)

    by The Optimizer ( 14168 ) on Saturday February 02, 2013 @04:32PM (#42773063)

    As someone with some game development experience, let me throw in some observations (based on the specs mentioned here).

    The 3.2 GHz PowerPC CPUs in the Xbox 360 and PS3 were in-order execution units. As I remember, code on the 360 typically executed about 0.2 IPC (instructions per cycle), sometimes worse. The very best hand-optimized assembly doing tasks like video decoding could execute about 0.9 IPC once properly cached and unrolled.

    AMD and Intel now have decades of R&D into out-of-order x86 execution (the x86/x64 opcodes being translated to internal micro-ops), which is a major factor in their performance. Even the PowerPC G5 chip devoted a good chunk of its silicon to out-of-order execution. The 360 and PS3 CPUs, designed almost 10 years ago, traded out-of-order execution for die size and clock speed.

    The specs say that the 1.6 GHz CPUs can issue up to 2 instructions per cycle. If real-world performance works out to an IPC of 1.2 to 1.6, which seems very doable, then you will see a 3x to 4x increase in the real-world rate of instructions being executed (0.2 IPC @ 3.2 GHz == 0.4 IPC @ 1.6 GHz). This doesn't take into account any efficiency gains due to the instruction set, cache, etc.

    And at the same time, I would imagine it's a whole lot easier to deal with other things on the chipset at 1.6 GHz than at 3.2 GHz (mature tech and all that).
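    The parent's arithmetic can be sketched out directly. This is just a back-of-the-envelope check using the commenter's own estimates (0.2 IPC on the old in-order cores, 1.2 to 1.6 IPC on the new out-of-order ones), not official figures from either console maker:

    ```python
    # Effective per-core instruction throughput = clock (GHz) x IPC,
    # giving billions of instructions per second (GIPS).
    # All IPC figures below are the parent comment's estimates.

    def throughput_gips(clock_ghz, ipc):
        """Billions of instructions retired per second on one core."""
        return clock_ghz * ipc

    old = throughput_gips(3.2, 0.2)      # 360/PS3-era in-order core
    new_lo = throughput_gips(1.6, 1.2)   # next-gen core, low estimate
    new_hi = throughput_gips(1.6, 1.6)   # next-gen core, high estimate

    print(f"old core:  {old:.2f} GIPS")
    print(f"new core:  {new_lo:.2f} to {new_hi:.2f} GIPS")
    print(f"speedup:   {new_lo / old:.1f}x to {new_hi / old:.1f}x")
    ```

    So despite the clock dropping by half, per-core throughput roughly triples or quadruples, which matches the 3x to 4x claim above before even counting cache or ISA improvements.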