PS3 Issues Caused GTA IV Delay? 117
Dr. Eggman writes "According to statements made by Michael Pachter on Gamasutra, 'The Rockstar team had difficulty in building an exceptionally complicated game for the PS3, and failed to recognize how far away from completion the game truly was until recently.' The article goes on to describe an agreement between Rockstar and Sony not to favor the 360 by releasing their version first, necessitating the delay on the 360 as well. Pachter's comments are interesting, because all Take-Two has been willing to say is that 'technological issues' were causing the hold-up. "
Re:Way to spin it into a PS3 problem zonk. (Score:5, Informative)
Wedbush Morgan's Michael Pachter says Take-Two management has "stumbled badly for the first time" with the delay of GTA IV, and said that he believes difficulties porting the game to the PlayStation 3 are to blame and that the company's new green light policy appears to be a failure.
RTFA much? (Score:4, Informative)
"Pachter: PS3 Port Caused GTA IV Delay"
Quoted verbatim from the article (emphasis mine):
"Wedbush Morgan's Michael Pachter says Take-Two management has "stumbled badly for the first time" with the delay of GTA IV, and said that he believes difficulties porting the game to the PlayStation 3 are to blame and that the company's new green light policy appears to be a failure."
The only confusing part is how you missed all that.
Re:Damn PS3's (Score:4, Informative)
But what is this "standard" that you are talking about? In game consoles, there is no such thing. Each console is simply the standard for that particular console, no more, no less.
Re:Damn PS3's (Score:3, Informative)
"Dealing" with the SPEs is trivial. An API to take a pointer, a byte count, and a destination to trigger a DMA is trivial to write. Having seven processors all working on a portion of the same problem by communicating and computing efficiently and calling those APIs at the appropriate times is the hard part. So far, that hasn't been solved for the general case (by anyone on any machine) and is still done by using gray matter. To paraphrase the old joke... calling an API is easy... knowing when to call it... that's the hard part.
To say it another way... in parallel programming, data partitioning and data flow are the hard parts of the problem to solve. Once you figure those out, the program more or less falls out of them. Knowing how to call an API with the right parameters is trivial. There are no efficient automated mechanisms that solve data partitioning and data flow for the general case. Some cases are pretty easy and have known solutions (and tools to do it for you), like dataflow pipelines. Beyond that, people write libraries to solve specific problems, like large sparse matrix solvers, large dense matrix solvers, and such.
So... no... I highly doubt Sony has released tools that make writing parallel programs easy for the general case. It doesn't matter whether dealing with an SPE and its communication is easy. Parallelising an AI algorithm, for example, that's the hard part, and it has to be done before you even touch the API.