
Square Enix Pulls, Apologizes For Mac Version of Final Fantasy XIV 94

_xeno_ writes: Just over a week after Warner Bros. pulled the PC version of Batman: Arkham Knight due to bugs, Square Enix is now being forced to do the same thing with the Mac OS X version of Final Fantasy XIV (which was released at the same time as Batman: Arkham Knight). The rather long note explaining the decision apologizes for releasing the port before it was ready, and blames OS X and OpenGL for the gap between the game's performance on OS X and on identical Mac hardware running Windows. It's unclear when (or even if) Square Enix will resume selling an OS X version — the note indicates that the development team is hopeful that "[w]ith the adoption of DirectX 11 for Mac, and the replacement of OpenGL with a new graphics API in Apple's next OS, the fundamental gap in current performance issues may soon be eliminated." (I'm not sure what "the adoption of DirectX 11 for Mac" refers to. OS X gaining DirectX 11 support is news to me — and, I suspect, Microsoft.) Given that the game supports the aging PS3 console, you'd think the developers would be able to find a way to get the same graphics as the PS3 version on more powerful Mac OS X hardware.
This discussion has been archived. No new comments can be posted.

  • DirectX 11 for Mac (Score:2, Insightful)

    by Anonymous Coward

    Wait what?????????

    • by adler187 ( 448837 ) on Friday July 03, 2015 @03:43PM (#50040635) Journal

      Probably referring to http://boilingsteam.com/codewe... [boilingsteam.com]

    • by beelsebob ( 529313 ) on Friday July 03, 2015 @04:32PM (#50040841)

      No, Metal for Mac, which is a graphics API that works in a similar way to D3D 11/12. i.e. not a state machine, but instead issuing buffers of commands based on pre-verified states.

    • by Anonymous Coward

      The submitter can't even read the article right. How hard is it to read nowadays?

      From the 2nd link in the OP's post.

      As Kasuga, our technical director and lead programmer, explained some days ago, the Mac version of FFXIV has been developed especially for Mac systems. It does not use a boot utility such as Boot Camp to run via a Windows OS on Mac hardware. Rather, it works by employing middleware developed by TransGaming (presently NVIDIA) to convert Windows’ DirectX drawing method into OpenGL on Mac systems.

      • So, it's a terrible port. Sounds like business as usual for Square.
  • by Anonymous Coward

    They probably just ran into a million issues on OS X and its implementation of OpenGL and Apple doesn't give a shit.

    I also never heard of DX11 on OS X. I imagine he must be referring to Bootcamp, although I don't know the state of Apple's drivers for bootcamp.

    I guess they could have just not released the game in the first place instead of pulling it later...

    • by Anonymous Coward

      Transgaming (now nVidia)'s Cider implementation was garbage, and the version they let the players download was "unfinished".

      We've been having this argument since Early Access about the Mac client, and basically once early access ended there was a lot of "this is stupidly poor performance, Boot Camp works better... even f*cking Parallels works better."

    • Re:Why release it? (Score:5, Informative)

      by carlhaagen ( 1021273 ) on Friday July 03, 2015 @04:24PM (#50040803)
      I've programmed portable OpenGL-based applications for many years for the three dominant desktop platforms - Windows, Linux, OS X - and I have no idea which million issues on OS X and its implementation of OpenGL it is you refer to.
      • The millions of issues it has are the dollars he's not getting because he hired shitty devs.

    • They're talking (badly) about Metal, which is a graphics API that's much more sensible than OpenGL (i.e. doesn't involve a bunch of state changing, and unverified states).

    • The DirectX 11 implementation they're referring to is built on top of OpenGL.

    • Re:Why release it? (Score:4, Informative)

      by jo_ham ( 604554 ) <joham999@noSpaM.gmail.com> on Friday July 03, 2015 @05:16PM (#50041039)

      You don't need Apple's drivers for Boot Camp for the GPU - you can just install the drivers that AMD and Nvidia supply for Windows.

      The package Apple ships with the Boot Camp drivers (which you install from a USB stick when you first set up Windows, and which has everything you need for the keyboard, networking, bluetooth, etc.) includes one of those OEM drivers from AMD or Nvidia; it just tends to be an older one, since they don't update the package all that often.

      Once you have Windows installed, though, it's no different to any other Windows machine in terms of GPU drivers.

    • by tlhIngan ( 30335 )

      They probably just ran into a million issues on OS X and its implementation of OpenGL and Apple doesn't give a shit.

      I also never heard of DX11 on OS X. I imagine he must be referring to Bootcamp, although I don't know the state of Apple's drivers for bootcamp.

      I guess they could have just not released the game in the first place instead of pulling it later...

      No, they basically recompiled their app using a Windows API library.

      There are lots of Windows API libraries - like WineLib - where you take your Windows source code and recompile it against the library to produce a native binary.

  • by Viol8 ( 599362 ) on Friday July 03, 2015 @03:47PM (#50040653) Homepage

    Sounds to me like they're DirectX coders who don't know how to code for OpenGL properly, but instead of fessing up they decided it's easier to blame their tools than themselves. Poor workmen etc...

    • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Friday July 03, 2015 @03:49PM (#50040663) Homepage Journal

      Even though the GPU makers focus on D3D and not OGL the GL performance is usually quite close to D3D. So when people start blaming their woes on OpenGL I start assuming they don't know what they're on about.

      • by vux984 ( 928602 )

        Yet it appears to run OK on the PlayStation 3 and 4. So... maybe it's something to do with OS X specifically, rather than them hiring coders who don't know OpenGL.

        I figure the statement from the engineering team got run through too many marketing and legal drones and the message that ultimately got released to the public is just word salad.

      • by Anonymous Coward

        If you read the article, it mentions that it uses TransGaming's middleware - i.e. Cider which is essentially their commercial version of WINE, which would explain performance issues.

      • by jo_ham ( 604554 )

        They are just blaming it on Apple's lack of effort on the OpenGL front. They're hardly pushing the boat out on it. Either way, it's no excuse for such a poor port.

      • Batman is optimized for Nvidia and uses Nvidia GameWorks, which is designed to run poorly on the ATI hardware Apple uses.

        GameWorks is a D3D extension framework too, and as far as I know it only runs on Windows. They can also ignore ATI issues on the PC port, since everyone uses Nvidia, and likely just blame ATI and have fellow gamers tell you "you should have gotten Nvidia" — which is exactly what Nvidia is hoping for.

        • ATI has also been dicking around with Mantle. Anyone could have told them we didn't need another proprietary standard, and now they have announced they are not going to be improving it in the future. So hopefully they will return to their core mission of making drivers that people will actually use. They're not very good at it, so I understand why they'd want to work on a research project that no one will use, but I'm sure it's a bit frustrating for the people who haven't learned not to buy ATI graphics cards.

    • by vux984 ( 928602 )

      Sounds to me like they're DirectX coders who don't know how to code for OpenGL properly

      And how on earth does that explain it running on PS4... AND PS3? Since when is the PlayStation a directX platform?

      https://store.na.square-enix.c... [square-enix.com]

      • by beelsebob ( 529313 ) on Friday July 03, 2015 @04:42PM (#50040887)

        That would be because the PS3 and PS4 use Sony's proprietary graphics API, which looks nothing like OpenGL.

        The OpenGL API contains various features that are simply not conducive to writing either a fast implementation of the standard, or a fast application that uses it. The two main issues are:

        1) That OpenGL is a state machine, draw calls are issued at arbitrary moments when in arbitrary states. This means that the implementation can't validate that the draw call was made in a valid state until you actually make the call. That doesn't sound like much, but it actually turns out to be a major headache. It means that compiling shaders can end up delayed until you actually make a call because you don't know what vertex formats it'll read, what blending modes it'll use, etc. It means that uploading data can be delayed until you make a call because you don't know what format it needs to be in. It means that blobs of data can't be placed in the right area of memory because you have no knowledge of whether the memory needs to be for fast reading only, fast read and write (only on the GPU), pulling off the GPU onto the CPU etc.
        2) That lots of OpenGL operations are explicitly thread safe, and there's no way to tell OpenGL about the fact that two operations won't interfere with each other. Want to overwrite an area of a texture for the next frame while the previous frame was rendering because you have knowledge that the two won't try to read and write the same area at the same time? Nope, tough shit, can't be done. Uploading the texture will block waiting for the GPU to finish rendering with it.

        Apple acknowledges that these are problems, and as a result they've made their own graphics API (Metal), which is much more similar to how D3D and Sony's proprietary APIs work. Thankfully, the next OpenGL spec (code named Vulkan) will head in this direction too, and maybe we can get back to the standard, open way of doing things being reasonable.
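The two problems described above can be sketched with a toy model (illustrative Python only; these are hypothetical classes, not real OpenGL or Metal calls): a state-machine API can only validate state at draw time, on every single draw call, whereas a pipeline-object API validates the complete state once, when the object is created.

```python
# Toy contrast between a state-machine API (OpenGL-style) and a
# pre-validated pipeline-object API (Metal/D3D12-style). Hypothetical
# classes for illustration; no real graphics API is involved.

class StateMachineContext:
    """OpenGL-style: global mutable state, validated on every draw."""
    def __init__(self):
        self.state = {"shader": None, "vertex_format": None, "blend": None}

    def set_state(self, key, value):
        # Nothing can be validated or compiled here: the implementation
        # doesn't yet know what combination of states the draw will use.
        self.state[key] = value

    def draw(self):
        # Validation (and potentially shader recompilation, data
        # reformatting, etc.) is deferred to every single draw call.
        if None in self.state.values():
            raise ValueError("incomplete state at draw time")
        return "draw(%s)" % self.state

class PipelineState:
    """Metal/D3D12-style: the whole state is validated once, up front."""
    def __init__(self, shader, vertex_format, blend):
        if not all([shader, vertex_format, blend]):
            raise ValueError("invalid pipeline description")
        self.desc = {"shader": shader, "vertex_format": vertex_format,
                     "blend": blend}

    def draw(self):
        # Cheap: everything was pre-verified at creation time.
        return "draw(%s)" % self.desc

ctx = StateMachineContext()
ctx.set_state("shader", "basic")
ctx.set_state("vertex_format", "xyz_uv")
ctx.set_state("blend", "alpha")
print(ctx.draw())  # validated here, and again on every subsequent draw

pso = PipelineState("basic", "xyz_uv", "alpha")  # validated once
print(pso.draw())
```

The design difference is that the pipeline object moves all the "is this combination of states even legal?" work out of the per-frame hot path.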

        • by vux984 ( 928602 )

          That would be because the PS3 and PS4 use sony's proprietary graphics API that looks nothing like OpenGL.

          Check the thread context.

          I never said PS3/PS4 use OGL. I was countering the argument that "it must be because the developers only know DirectX and are now blaming their tools".

          The point stands that the problem is in fact specific to OSX and OpenGL and is NOT the fault of the developers only being competent with DirectX.

    • by alexhs ( 877055 ) on Friday July 03, 2015 @04:18PM (#50040773) Homepage Journal

      I wouldn't blame the coders, unless they were responsible for the technology choices.

      It's first and foremost a management issue:

      However, in the chaos leading up to the multi-platform launch of our expansion, we released incorrect requirements, which were not updated prior to the Mac version’s official release.

      However, due to our miscommunication with retailers, the Mac version was made available earlier than intended. As a result, some customers were able to download and play a pre-release build which suffered from performance problems.

      If that's not management rotten to the core, what is?

      Rather, it works by employing middleware developed by TransGaming (presently NVIDIA) to convert Windows’ DirectX drawing method into OpenGL on Mac systems.

      Any company relying on Microsoft technology to achieve cross-platform deserves a spectacular failure anyway.

      • by _xeno_ ( 155264 )

        If that's not management rotten to the core, what is?

        Final Fantasy XIV is kind of the poster child for bad management at Square Enix, to the point where they actually fired the original management team. This new fiasco is from the team hired to replace the original team.

        Any company relying on Microsoft technology to achieve cross-platform deserves a spectacular failure anyway.

        Which makes no sense, because they've already ported the graphics engine twice! The game also supports the PS3 and the PS4. If they can deal with three different graphics engines, you'd think adding a fourth would be no big deal.

        • If they can deal with three different graphics engines, you'd think adding a fourth would be no big deal.

          Think about digging a hole for a basement.
          After you have done 3, the 4th is the same work. You only know more about taking breaks, drinking water, etc.

    • by tk77 ( 1774336 ) on Friday July 03, 2015 @04:27PM (#50040813)

      What's worse is that it appears they weren't even developing for OpenGL, but rather using TransGaming's (nVidia) Cider to translate DX calls to OGL.

      "...it works by employing middleware developed by TransGaming (presently NVIDIA) to convert Windows’ DirectX drawing method into OpenGL on Mac systems."

      They then go on to compare OGL and DX and claim that if it was developed natively for OGL there would be a 30% performance gap. Excuses for laziness, in my opinion.

    • by Anonymous Coward

      How true. Recently, in my last job, I had to port an application from DirectX to OpenGL (actually more like from Windows to Linux and Mac). Their internally developed port was disappointing in 3D quality and performance, but my port performed better than the DirectX version in every respect. They were surprised by the result, and when they asked me how that was possible I simply told them: "that is the difference between an OpenGL specialist and a programmer that knows how to use DirectX". I admit that some

    • by Damarkus13 ( 1000963 ) on Friday July 03, 2015 @08:00PM (#50041705)
      They didn't even try. They slapped a compatibility layer (Cider) on their DX11 engine and now are acting shocked that the performance is terrible. Sounds to me like management looked at the cost of licensing Cider vs. the cost of actually writing an OpenGL engine. It's probably not the workmen's fault.
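A toy model makes the cost of such a compatibility layer concrete (all names here are hypothetical; this is not real Cider, DirectX, or OpenGL code): every game-side call must be translated, per call, into the native API's vocabulary, fanning out into multiple calls plus lookup and bookkeeping overhead.

```python
# Toy model of a DirectX-to-OpenGL translation layer such as Cider.
# All names are hypothetical; this only illustrates why per-call
# translation plus semantic mismatches cost performance.

class FakeGL:
    """Stands in for the native API on the target platform."""
    def __init__(self):
        self.calls = 0
    def bind_texture(self, tex):
        self.calls += 1
    def set_blend(self, mode):
        self.calls += 1
    def draw_arrays(self, first, count):
        self.calls += 1

class TranslationLayer:
    """Accepts D3D-style calls, emits GL-style calls."""
    BLEND_MAP = {"D3DBLEND_ALPHA": "GL_SRC_ALPHA"}  # made-up mapping

    def __init__(self, gl):
        self.gl = gl
        self.translated = 0

    def draw_indexed(self, texture, blend, index_count):
        # One game-side call fans out into several native calls, plus a
        # dictionary lookup and state re-checks, on every single call.
        self.gl.bind_texture(texture)
        self.gl.set_blend(self.BLEND_MAP[blend])
        self.gl.draw_arrays(0, index_count)
        self.translated += 1

gl = FakeGL()
layer = TranslationLayer(gl)
for _ in range(1000):  # a frame's worth of draw calls
    layer.draw_indexed("rock.png", "D3DBLEND_ALPHA", 300)
print(layer.translated, gl.calls)  # 1000 game calls -> 3000 native calls
```

A native OpenGL back-end would issue those native calls directly and skip the translation bookkeeping entirely, which is the point the commenters above keep making.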
  • by Anonymous Coward

    Most other developers seem perfectly content developing with OpenGL.

  • by Anonymous Coward

    The port uses Cider, a thing similar to Wine. The funny thing is that the game runs better in a virtual machine on OS X than the Cider version does. So the issue isn't OS X, but the crappy Cider layer.

  • This is all rather fishy. I am wondering if Steam is actually curating big releases for quality and pulling these titles from Steam itself, while only allowing the publisher/developer to make it seem like it was their own decision all along.

    • FF14 doesn't require Steam. It's on Steam, and you can get it there, but if you buy it anywhere else Steam isn't used at all.

      And yes, they halted sales through the Square-Enix webstore and other retailers as well.

  • by Anonymous Coward

    Mac OS is horribly broken as a gaming operating system, and basically all games have issues. I've played Kerbal Space Program, Crusader Kings, Planetary Annihilation, Diablo 3, etc. All these games run very nicely on Windows, but every one of these exhibits some kind of quirk or bug. Fullscreen or windowed, it doesn't matter.

    I have a MacBook Retina which I use for development, it has a 2880x1800 screen, but you need to go into 'Display' and set it to Scaled and 'More Space' in order for it to render like it has a 1920x1200 screen.

    • by maccodemonkey ( 1438585 ) on Friday July 03, 2015 @05:23PM (#50041077)

      I have a MacBook Retina which I use for development. It has a 2880x1800 screen, but you need to go into 'Display' and set it to Scaled and 'More Space' in order for it to render like it has a 1920x1200 screen. For non-retina applications the OS reports the resolution as 1920x1200, then it upscales the application window to 3840x2400 and also does composition at 3840x2400, which is then downscaled to 2880x1800 and displayed. Performance gets even worse if you also have an external screen: it does the same thing there, rendering an external 3840x2160 buffer and downscaling it to 1920x1080. Basically, you add an external 1080p screen and it will try to make the integrated graphics render for what amounts to two 4K screens.

      When you run in scaled mode like that, ALL applications over-render. Both retina and non-retina. It's why I really suggest people avoid the scaled modes.

      Non-scaled displays do not scale. I've verified that at work. So external displays do not over render unless you have a 4k display and you've put it in a scaled mode.

      Games are actually one exception. A full screen OpenGL game gets to directly output properly to the screen. Full screen OpenGL doesn't get scaled or over rendered. I've verified this on multiple Apple platforms with OpenGL code of my own. It means on a device like the 6 Plus where scaled output is normal (the 6 Plus has a 1080p screen but has a much higher res frame buffer) OpenGL performance isn't degraded. I even have non-full screen OpenGL code that doesn't get over-rendered either.

      My guess is that none of the original article has to do with scaling at all. It's likely they're using something like Cider that abstracts DirectX calls to OpenGL, and has always really sucked for performance. (EA did several ports with Cider and they all had severe performance issues as well.) OpenGL on the Mac also just has general issues.
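As a sanity check on the scaled-mode numbers quoted in this thread, the over-render factor is simple arithmetic (assuming, as the commenter describes, that 'More Space' mode renders at 2x the 1920x1200 logical resolution and then downsamples to the panel):

```python
# Back-of-the-envelope arithmetic for Retina scaled-mode over-rendering,
# using the resolutions quoted in the thread above.

def pixels(w, h):
    return w * h

native = pixels(2880, 1800)          # physical Retina panel
scaled_render = pixels(3840, 2400)   # 2x the 1920x1200 logical size

overdraw = scaled_render / native
print("scaled mode renders %.2fx the panel's pixels" % overdraw)  # 1.78x

# An external 1080p display driven the same way (rendered at 3840x2160,
# then downscaled to 1920x1080) roughly doubles the workload:
external_render = pixels(3840, 2160)
total = scaled_render + external_render
print("total rendered pixels: {:,}".format(total))  # comparable to two 4K screens
```

So the "two 4K screens" claim above checks out: the GPU composites roughly 17.5 million pixels per frame, versus about 16.6 million for two true 3840x2160 displays.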

    • by Anonymous Coward

      Hmm, I'm most certainly not a fan of Apple or the Mac, but I'm pretty much a fan of logic. And using games specifically written for one platform, then extremely badly ported to others, as evidence that those platforms are "horribly broken" seems like, well, pretty horribly broken logic itself. I guess you could figure out why if you think about it enough. Preferably before you post next time.

      It would seem far more logical to me that nobody actually cared about the Mac port, nobody involved rea

  • How retarded is that?

    Just because M$ is the market leader suddenly there is a DirectX for Macs? Never heard about that before.

    And game vendors prefer DirectX over OpenGL?

    One of the reasons I don't work in the game industry ... I would vomit before going to work, several times in between and several times after.

    How anyone thinks they can program a game for DirectX and then port it to OpenGL, instead of the other way around or simply only doing OpenGL, is beyond me.

    • Compared to DirectX, OpenGL is a terribad API to work with.

      If you are using an engine such as Unreal or Unity with multiple back ends, then OpenGL becomes somewhat feasible. Otherwise developers are better off choosing DirectX and going Windows only, targeting 95% of the gaming PC market.

      • by _xeno_ ( 155264 )

        But that's the thing - the game in question uses a custom in-house graphics engine written to support the PC, PS3, and PS4. They're already maintaining three separate rendering back-ends, including one that's intended to target a console that's nearly a decade old.

        I find it really, really hard to believe that they can't get a game that's designed to be playable on the PS3 to run on modern Mac hardware.

      • Care to explain what is so horrible with OpenGL?

        After all, I have plenty of OpenGL games, like Descent ... seems the developers at that time did not find it horrible at all.

        Ten years ago everyone was complaining about how horrible DirectX was ... I wonder when and why that should have changed.

  • race to the bottom (Score:3, Insightful)

    by Anonymous Coward on Friday July 03, 2015 @06:29PM (#50041365)

    I'm noticing a trend here: all these high-profile, multimillion dollar budget games that run on the latest h/w have major bugs. And I mean major bugs that ruin the experience. With these skyrocketing project budgets, that's kinda odd.

    Could it be that big budget games are getting pressure from their iOS/Android counterparts (hence rushed and over-hyped)? Sounds similar to youtube/shorts/webisodes vs scripted major network content.

    Or is it that all the great tools and superstar coders mean nothing: rushing a major game to market (within 1 yr) has hit its limits, and games simply need more than 12-16 months. Otherwise, expect some level of failure. No different from a Pixar movie that typically takes 4 yrs, compared to a 35-day Transformers shoot... time == quality.

    • Their new DX11 client launched without a hitch, though - some people's FPS went up even as their GPU usage went down, and the graphics are a bit prettier.

      Most of the people demanding a Mac native client were already playing the game - they just wanted something that they could play and also alt-tab out to surf the web and check email and crap the way the Windows people do.
  • They tested the Mac version on a Mac running Windows, or as I like to call it, A WINDOWS PC. What about that was supposed to work? I think I know where they went wrong: designing a game for a Mac.
  • Looks like this has been said in previous comments - If you remember TransGaming's Cedega for Linux, a closed version of Wine that was considered reasonable competition to the open source version way back in the day, then you've now heard of TransGaming's Mac port of it. It's also considered a bit outdated as well, as current builds of Wine seem to work better on OS X for running games. I remember EA went through a phase of using it and the end results were pretty terrible. I'm amazed that Square Enix actua
  • Now they should apologize for *all* versions of Final Fantasy XIV.
    • They apologized for 1.0 because it was bad. But 2.0 was pretty good, and the new expansion (dubbed 3.0) is so crammed with fan service that we decided that they're never going to remake 1-6, they're just going to put all the contents of those games into XIV and call it done.
