AI Games

Game Developers Are Getting Fed Up With Their Bosses' AI Initiatives (wired.com) 76

More than half of video game developers reported their companies are using generative AI in game development, according to an annual survey released Tuesday. The Game Developers Conference (GDC) report found that 52% of developers worked at companies using AI tools, while 30% felt negatively about the technology, up from 18% last year. Only 13% believed AI had a positive impact on games, down from 21% in 2024.

One in 10 developers lost their jobs over the past year, with some reporting extended periods of unemployment. One developer cited in a Wired story said they submitted 500 job applications without success, while another reported being laid off three times in the last year. Covid-era over-expansion, unrealistic expectations, and poor management are being identified as key factors behind the industry's troubles.

Game Developers Are Getting Fed Up With Their Bosses' AI Initiatives

Comments Filter:
  • time to go union! (Score:2, Insightful)

    by Anonymous Coward

    time to go union!

  • I recently heard that companies such as Activision are spending upwards of $700 million [videogameschronicle.com] on AAA titles such as Call of Duty. That can work as long as they can keep selling 30 million copies of the game or more, but there have been a lot of widely publicized flops recently. I forget what Sony spent on Concord, but it was several hundred million dollars as I recall.

    With budgets that large and uncertainty about recouping large investments, it's hardly surprising that companies would be looking at generative AI.
    • by gweihir ( 88907 )

      30M copies? Even the absolutely amazing and exceptional BG3 only sold an estimated 15-30M copies. How would a more-or-less generic CoD top that?

      • BG3 was an excellent game; it was the best I'd played in a while.

        • Also a very expensive game. It's very large, it has more extensive voice acting than I've ever seen in any game, and it likely took tons of actual developer time as well (though it reuses the engine from D:OS2). That's why they're not doing another one, it's just massive. People bitch and whine about Bethesda taking too long to make their open-world games, but those really do take a very long time to make. There are other expenses too. I don't know if it is still true, but Sony used to extract a large perc

          • > That's why they're not doing another one, it's just massive.

            The word I heard is that after BG3 came out, someone (the IP owner, the distributor, the owner of the studio?) wanted microtransactions added to it. The studio said, "Nope, we're done, we're outta here" in response.

            From some of the reporting, though, I'm thinking that may be just another rumor. Publicly, they have stated (via IGN [ign.com]) that they "just weren't feeling it" and wanted to pivot to something else.

            Whether that is "just saving face" (for s

            • The request, actually a demand, to add microtransactions and live service to Redfall is most likely what ruined that game. It had enough of a bad rep before release that they apparently removed the microtransactions. Then it was seemingly rushed out, almost certainly an upper-level management decision, not one from Arkane itself. From a really highly regarded studio with awesome games to a studio being brutally closed. And customers who don't know this still say "it had a lousy story, Arkane is a lousy studio."

    • by garett_spencley ( 193892 ) on Tuesday January 21, 2025 @03:12PM (#65107039) Journal

      but this is no different than the power loom putting those who would knit or sew by hand out of business at the start of the Industrial Revolution.

      This assumes that generative AI is an effective tool for increasing productivity. In certain niche applications, this productivity boost is indisputable. But IMO we're still at the very early stages of figuring out what this tech is and is not good at.

      For example, at my current place of work, people are using LLMs trained on our [very large] code-base and are starting to use tools like Cursor to ask questions of it, such as "is there already a function somewhere for this thing I need to do?" These developers are having wonderful results in this area. On the other hand, many of us who have tried to use it to help us write code are seeing a lowering of productivity as a result. Of course, we need to explore whether that's the ramp-up time needed to learn a new tool and break old habits, etc. My point is that if you look at what the hype train is promising vs what it can actually deliver, there is a pretty massive gap.
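
      As a toy illustration of that "is there already a function for this?" workflow, here is a minimal keyword-based stand-in for what an LLM-backed tool like Cursor does with far more context; the function names and layout are hypothetical, and this sketch assumes a tree of Python sources:

      import os
      import re

      def index_functions(root):
          """Map each Python function name found under root to the files defining it."""
          index = {}
          for dirpath, _, files in os.walk(root):
              for name in files:
                  if name.endswith(".py"):
                      path = os.path.join(dirpath, name)
                      with open(path, encoding="utf-8", errors="ignore") as f:
                          for match in re.finditer(r"^\s*def\s+(\w+)", f.read(), re.M):
                              index.setdefault(match.group(1), []).append(path)
          return index

      def search(index, query):
          """Rank known function names by how many query words they contain."""
          words = set(re.findall(r"\w+", query.lower()))
          hits = [(sum(w in name.lower() for w in words), name, paths)
                  for name, paths in index.items()]
          return [h for h in sorted(hits, reverse=True) if h[0] > 0][:5]

      # e.g. search(index_functions("src/"), "retry http request") -> top candidates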

      Proponents argue that the tech will inevitably improve and they might not be wrong, but we're talking about the state of the world as it exists *today* vs speculation.

      We're in a recession. That needs to be remembered in these conversations. Now combine that fact with the fact that LLMs are today's hype train, and suddenly you see CEOs boasting (to their investors) that they can cut labour costs by adopting LLMs. Then you get the recently unemployed who hear this and assume that their jobs were cut because AI made their positions redundant (when they might have been let go in cuts anyway). And of course you're going to have a bit of both: companies will experiment with LLMs and cut staff in the meantime, and a couple of quarters down the road we'll start to hear how these initiatives actually worked out in practice.

      Lastly, and this is a totally pedantic thing that I offer purely for conversation, not to nit-pick or argue anything: what I think you're referencing is the Jacquard Loom, which came about during the Industrial Revolution. It is widely considered to be the invention that gave birth to the punch card, and as a result some even argue that it was the first "programmable computer," since it may have been the first machine that exhibited different behaviour given operating instructions. It was powered, and something that I don't think a lot of programmers even know is that it wasn't technically a new type of loom, but a machine that attached to existing looms.

      I think the difference between the Jacquard Loom and today's generative neural networks is that the productivity boost represented by the Jacquard Loom was immediate and obvious, while we can only reach that conclusion about generative AI across all industries if we take the marketing people's use of "AI" at face value. It is and will continue to be useful. But will it be able to improve productivity enough that two jobs can be done by one person, across enough industries, that it will represent a major disruption? I think we still need to wait and see.

      • by taustin ( 171655 ) on Tuesday January 21, 2025 @03:27PM (#65107089) Homepage Journal

        A better, and more direct, comparison is the advent of Photoshop and similar digital graphics programs for generating artwork for advertising. Prior to that technology revolution, artists worked with pen and ink or paint and canvas, and it took a lot longer to produce each piece of art, which then had to be converted into the format needed by the printers. Photoshop reduced production time by probably an order of magnitude, and produced digital files that the printers could read directly. A lot of artists needed new careers, because their physical art skills no longer applied.

        Generative AI for images, like Midjourney, is exactly the same transition. Work that used to take hours or days to produce now takes seconds, and skill with Photoshop is less important than skill with writing prompts. The end result is that each artist can produce more images more quickly, and the companies will need fewer artists.

        It's the shape of things to come, and trying to stand in the way of progress is likely to get you run down.

        • A better, and more direct, comparison is the advent of Photoshop and similar digital graphics programs for generating artwork for advertising. Prior to that technology revolution, artists worked with pen and ink or paint and canvas, and it took a lot longer to produce each piece of art, which then had to be converted into the format needed by the printers. Photoshop reduced production time by probably an order of magnitude, and produced digital files that the printers could read directly. A lot of artists needed new careers, because their physical art skills no longer applied.

          Generative AI for images, like Midjourney, is exactly the same transition. Work that used to take hours or days to produce now takes seconds, and skill with Photoshop is less important than skill with writing prompts. The end result is that each artist can produce more images more quickly, and the companies will need fewer artists.

          It's the shape of things to come, and trying to stand in the way of progress is likely to get you run down.

          Definitely. Yet we still have people here muttering that it's a bunch of nothing.

          • by taustin ( 171655 )

            Text generative AI isn't ready for prime time. The accounts of it making things up are so common they've been given a name, and the nature of those mistakes - a religious AI insisting it was a real priest, advice to glue the cheese onto your pizza, made-up legal precedent complete with quotes, a suicide hotline AI telling callers to commit suicide, etc. - makes it very clear that it's not useless so much as dangerously broken.

            Image generative AI is a different beast. Text AI is supposed to report true things but m

      • but this is no different than the power loom putting those who would knit or sew by hand out of business at the start of the Industrial Revolution.

        This assumes that generative AI is an effective tool for increasing productivity. In certain niche applications, this productivity boost is indisputable. But IMO we're still at the very early stages of figuring out what this tech is and is not good at.

        This is very insightful. AI, now and in the future, will be able to do or help with some things well and not with others. Even as AI tools continue to evolve, humans will also evolve in figuring out where and how to use them. The total productivity increase will come from a combination of improvements in the AI tools and growing human expertise with those tools.

      • We're in a recession. That needs to be remembered in these conversations. Now combine that fact with the fact that LLMs are today's hype train, and suddenly you see CEOs boasting (to their investors) that they can cut labour costs by adopting LLMs. Then you get the recently unemployed who hear this and assume that their jobs were cut because AI made their positions redundant (when they might have been let go in cuts anyway). And of course you're going to have a bit of both: companies will experiment with LLMs and cut staff in the meantime, and a couple of quarters down the road we'll start to hear how these initiatives actually worked out in practice.

        You nailed an important point. A layoff is a sign of failed planning. Cutting jobs due to AI is aligning with the future!!! Why not tell investors we're laying people off due to the productivity boosts from all your super-successful AI investments!!!! The CEO didn't fail to plan... The company isn't shrinking... or heartlessly laying off people to make up for their past mistakes... they're preparing for the future!!!! Marc Benioff already pulled this stupid shit long ago. Kill two birds with one stone.

      • by Gleenie ( 412916 )

        but this is no different than the power loom putting those who would knit or sew by hand out of business at the start of the Industrial Revolution.

        Even better, because the knitting machines get a whole new lease on life making 7-fingered gloves for all the AIs.

      • Dunno.... I'd be more interested in seeing AI as part of the game. Imagine FPS bots that actually learn from humans playing...

      • For me, as a senior developer, "coding" isn't the majority of my time. Right now I've coded up a fix that was pretty darn easy. But I need to email people and arrange for access to a certain machine configuration to test it on, and then there's the subsequent QA testing, which takes time. The coding is really the most minor part of the job (except maybe for the web-app continuous-delivery people who churn out unnecessary changes in each and every scrum).

    • Indeed, though this sort of AI might help the indies more.

      Personally, I'm playing a sandbox game right now, "Planet Crafter". It has "generated wrecks" for you to explore and loot as a late game option.

      I think it'd be something that would be excellent to use AI for. Currently, the generated craft layouts make no sense. Combined with the reuse of the same corridor components, it becomes very easy to get lost, because every intersection is identical.

      So, at least for my "sandbox builder" games, I can see s

      • by Calydor ( 739835 )

        We've had procedural generation for decades; all it requires is some fiddling to keep the randomness within certain bounds. For your examples with bridge, medbay, etc., simply tell the ship generator to include at least four of these ten room types, have each room type be generated separately with building blocks that fit that kind of room, and tweak the generator a few times as you tell it to generate hundreds of rooms to see how they turn out. This does not take AI, unless you want to call ALL computer actions 'AI'.
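
        A minimal sketch of that kind of bounded generator (the room names, counts, and block numbers here are all made up; plain random sampling, nothing learned):

        import random

        # Hypothetical catalog; a real generator would carry much more metadata per type.
        ROOM_TYPES = ["bridge", "medbay", "engine", "cargo", "crew",
                      "lab", "armory", "galley", "hydroponics", "reactor"]

        def generate_wreck(min_rooms=4, max_rooms=7, seed=None):
            """Pick a bounded random subset of room types for one wreck."""
            rng = random.Random(seed)
            rooms = rng.sample(ROOM_TYPES, rng.randint(min_rooms, max_rooms))
            # Each room would then be assembled from building blocks that fit
            # its type; a block count stands in for that step here.
            return {room: rng.randint(2, 5) for room in rooms}

        for i in range(3):
            print(generate_wreck(seed=i))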

        • I think I didn't get what I was after across that well.
          Though variance in the rooms themselves is an obvious conclusion and one I'd support, I was actually going for something simpler: the hallways between the various rooms are not logically laid out. I.e., if you viewed the ship from outside, it'd look like a baby playing with legos put it together, maybe worse.
          Think of it like house design - you try to avoid having excess useless hallway space.
          And I mentioned a couple of times that it doesn't actually take AI but
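
          For what it's worth, one classic non-AI way to get that "house-like" economy of hallways is to connect room centers with a minimum spanning tree, which minimizes total corridor length; a rough sketch (the coordinates are made up):

          import math
          import random

          def mst_corridors(rooms):
              """Connect rooms with minimal total corridor length (Prim's algorithm).

              rooms: list of (x, y) room centers; returns corridor index pairs."""
              connected, corridors = {0}, []
              while len(connected) < len(rooms):
                  # Shortest edge from the connected set to any unconnected room.
                  _, i, j = min((math.dist(rooms[i], rooms[j]), i, j)
                                for i in connected
                                for j in range(len(rooms)) if j not in connected)
                  corridors.append((i, j))
                  connected.add(j)
              return corridors

          rng = random.Random(1)
          rooms = [(rng.uniform(0, 50), rng.uniform(0, 50)) for _ in range(6)]
          print(mst_corridors(rooms))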

    • Digital artists aren't going away, but companies won't employ as many and those who do this work will be expected to know how to make use of these tools as part of their workflow in order to increase productivity and speed up development.

      My graphic artist friend tells me that two-thirds of graphic artist jobs are going away. I'm not sure exactly what fields he means, but it seems as though the process of replacing "rote" graphics work with AI is already underway.

    • by jythie ( 914043 )
      Don't forget: the industrial revolution was a meat grinder. It is great if you were on the land-owning side, or looking back through history as a survivor, but not so great for most people. It is comparable to the Black Death: sure, it increased wages for people who survived, but it wasn't a good deal for those who did not.
  • If the anecdotal stories can be taken as evidence, that's a 10% cut of developer jobs in one industry. Zuckerberg told Joe Rogan ~ 10 days ago ~ that AI-produced code will be at mid-engineer level within the year. https://www.entrepreneur.com/b... [entrepreneur.com] Whether fact or just CEO belief, it doesn't matter; it will be enacted as though real for cost-cutting reasons. Within 3 years, AI will eliminate 25%-50% of all programmer jobs. If you're a coder, get out now.
    • ...Within 3 years, AI will eliminate 25%-50% of all programmer jobs. If you're a coder, get out now.

      I disagree. I think within 3 years the AI we know now will no longer be a fad; the boom will have gone bust because the models are hitting a wall. That being said, I just need to keep my job for the next 5 years and then I can retire. The younger coders, such as my nephew, will have interesting times.

      • Indeed. Models hallucinating in human-language responses is annoying. The same in computer code is fatal.

        Of course, if AI is really as smart as the proponents make out, people wouldn't need to ask it to write code; they could just ask it to solve the problem or emulate such-and-such app directly.

      • That's why we're not CEOs. We're too grounded in reality for that sort of job.

    • by allo ( 1728082 )

      AI will eliminate 25% of the programmers' work. That will lead to more productivity (i.e. better software, more features, faster bugfixes) and not to fewer developers. The budget to hire developers is still there, and if you can use it to get better software, then you use it.

      • by taustin ( 171655 )

        You are clearly unaware of how MBAs "think."

        "Perfection is the enemy of good enough."

        • And "Right Now" is better than "Right".

          "... unless someone is going to sue us, in which case I will throw you under the bus."

      • AI will eliminate 25% of the programmers' work. That will lead to more productivity (i.e. better software, more features, faster bugfixes) and not to fewer developers. The budget to hire developers is still there, and if you can use it to get better software, then you use it.

        Swing and a miss. The budget for "programming" will slowly be swallowed by the promises of AI, and that will *NOT* leave the same budget for hiring actual developers. The total programming budget will be split between humans and AI until the AI can swallow the whole thing in those sectors where it works. The IT budget won't increase to include AI alongside humans.

      • You reminded me of this joke: an excited CEO rushes into the room and exclaims, "Hey everybody! With these new tools our productivity is twice as high!" The employees are like, "Oh great, do we get paid twice as much now? Do we get to work half the hours now? Does our product cost half as much now?" The CEO is like, "Oh, actually this means half of you are getting sacked very soon."

        disclaimer: I have an MBA degree, but I'm not evil

    • by gweihir ( 88907 ) on Tuesday January 21, 2025 @03:02PM (#65107005)

      Zuckerberg told Joe Rogan ~ 10 days ago ~ that AI-produced code will be at mid-engineer level within the year.

      Hahahaha, no. It can beat inexperienced beginners, but that is about it. It will save better coders some time, occasionally, but only if they are pretty smart and experienced. And there is no sane reason to expect it will get a lot better anytime soon. None at all. Remember, this is _old_ tech scaled up. All the easy-to-find tricks are already in there.

      • by jma05 ( 897351 )

        It already makes me several times more productive. The next step of evolution is in inference-time scaling with agents. That will make most of the current app-development workflow disappear. You will still need programmers to be responsible for managing the business-specific logic, but the framework code should melt into the background. That need not mean we will need any fewer programmers; it just will mean programmers will largely focus on a different set of considerations and have a very different set of expe

        • by fuzzyf ( 1129635 )
          Several times more productive?
          That would be true if you are creating Hello World apps for a living.

          LLMs are impressive, but they're nowhere near being able to code reasonably sized software projects. They can certainly help with scaffolding and plumbing, but they just can't be trusted with bigger tasks. A developer's job is not to churn out code 8 hours a day. A developer's job is to transform human language from the customer/employee into features/abilities in the software, and to do that in a way that is structured.
          • by gweihir ( 88907 )

            Indeed. I guess some people are just projecting their hopes and fantasies and then mistaking that for reality. And some other people are doing bullshit jobs, and, yes, better bullshit is one of the few things LLMs can do. In the coding space, anybody that does work on that level is probably just doing interfaces, no real code.

            The most important thing that could be added to LLMs is a fact-checking ability. But that is not actually possible mathematically. It would have to be a completely different tech and none of the known approaches can do it outside of _very_ narrow subjects and pretty simple facts.

            • by jma05 ( 897351 )

              > In the coding-space anybody that does work on that level is probably just doing interfaces, no real code.

              Very few programmers do "real code". Most code outside core components is just a tedious artifact, interfaces or not.

              > The most important thing that could be added to LLMs is a fact-checking ability. But that is not actually possible mathematically. It would have to be a completely different tech and none of the known approaches can do it outside of _very_ narrow subjects and pretty simple facts.

          • by jma05 ( 897351 )

            These are research and niche projects - not huge amounts of code, no legacy code, no team, no evolving business customer requirements to deal with, and I propose most of the feature set - so you might say these are simpler projects than you might have. But these are significant projects and the stakeholders are important people. I have been programming for decades, although my job description was never that of a programmer. I find LLMs to be fantastic for my workflow. I would not go back to the world of 2 years ago.

        • ...I would not do many advanced things in code because I did not want to introduce complexity.

          Complexity has real-world effects, such as slowness, or increased image size, or increase in failure modes. In contexts where that matters, allowing complexity to increase because "it's easy to do it" is irresponsible.

          I'm not hating on you, okay? I just can't imagine dealing with the code bloat for any of the products that I've ever worked on.

          • by jma05 ( 897351 )

            I used abstract language. I understand that encourages subjective interpretation.
            Those are not the issues I have.

      • I tried to get AI to do some things like create me a script to do some basic backup stuff. I wound up just writing the script from scratch, as it was trying to pass arguments that a program didn't use. It can make some stuff, but the error rate is just too high to be reliable.

        It reminds me of OCR back in the 1990s. It took a lot of manual correction to ensure the scanned text was right. However, OCR has gotten a lot better, and I assume that AI will get better with code. Right now, for anything oth

        • by gweihir ( 88907 )

          I think the comparison is flawed. With OCR, all the data is there in the input, and it was basically a computing-power problem only; i.e. relatively early OCR could already perform at present levels but was abysmally slow and needed way too much memory. The problem with improving LLMs is much more fundamental.

          That said, maybe something outside of LLM tech could be done. Not in a year or five, but eventually. On the other hand, the classical approach is libraries, and that alone does not work too well because too much

    • If the anecdotal stories can be taken as evidence, that's a 10% cut of developer jobs in one industry. Zuckerberg told Joe Rogan ~ 10 days ago ~ that AI-produced code will be at mid-engineer level within the year. https://www.entrepreneur.com/b... [entrepreneur.com] Whether fact or just CEO belief, it doesn't matter; it will be enacted as though real for cost-cutting reasons. Within 3 years, AI will eliminate 25%-50% of all programmer jobs. If you're a coder, get out now.

      I think this will be true in the big-business tech sector. I'm not sure about other sectors. I know in my very niche programming job the LLMs can't quite seem to help, and in fact seem to be mostly a hindrance. That's thus far. Maybe some giant leap will happen in the coming year to change that, but something tells me common use cases will be tackled long before my particular weirdo cul-de-sac of programming is. I don't know that we'll ever get LLM-style AI up to full-fledged programmer status. From what I've

  • by MpVpRb ( 1423381 ) on Tuesday January 21, 2025 @03:19PM (#65107063)

    I predict that eventually it will be developed into something truly useful.
    Unfortunately, investors demand profits now, salesweasels aggressively push crappy AI products, and clueless managers believe the hype.
    Meanwhile, the AI crapfest continues, spewing a tsunami of crap.

    • by Lehk228 ( 705449 )
      it has plenty of uses: when you just need a quick throwaway illustration to make a blog or article look nice it's fine, and can show you half a dozen variants in like 30 seconds, or serve as a base that you are going to chop up and do the finishing work on with GIMP or Photoshop. it's like the digital equivalent of stopping at the dollar store for some tinsel and ornaments
  • Procedurally generated game maps date back to 1980. Since then, millions of maps have been generated, putting countless dungeon designers out of jobs. How many kobold positioners are destitute?

  • by burtosis ( 1124179 ) on Tuesday January 21, 2025 @03:51PM (#65107199)
    Game developers just want to move to 100% AI to make the games, because they don't care about anything but money, and AI means not paying people. Meanwhile, gamers are getting upset with all these microtransactions for things they could farm, and would love AI to simply bot for them, since it frees them up to do other things and just pop on and enjoy the resources. Thus it's a proof of work, and together the system provides the basis for a new imaginary currency!
  • "Getting fed up with your boss's AI initiative is a fun topic! Let's look at some possible ways you could handle it."
  • by silentbozo ( 542534 ) on Tuesday January 21, 2025 @04:03PM (#65107241) Journal

    During covid, there was massive overhiring to lock up talent - to make sure you had it when you needed it, because it was expensive to recruit and find people "just in time" - and possibly (for some companies) to deny their competition access to labor. In the games industry, this took the form of acquiring smaller game studios, many of which have since had their projects cancelled and been shuttered.

    With this whole AI thing, companies are now panicking that they'll have too many people on the bench if AI hits, so they're shedding headcount to conserve cash, which is now leading to too many people in certain industries chasing too few jobs. It's not just that they think they can produce more with fewer people - there's a potential threat from smaller competitors being able to outcompete them, because the moat of needing hundreds of people to build a complex AAA title is potentially dissolving. (I would argue that the whole idea that you needed to go for crazy high-end graphics over gameplay is a wrong-headed expression of that need for a moat, but either way, you get the same result - worse return on investment than expected.)

    What these companies aren't taking into account is that every one of these employees is now potentially capable of creating one of those small competitors to compete with their former employers (assuming you're in a state where non-competes are unenforceable). I don't know if a successful crowdfunding of a game like Star Citizen can happen again, but the free/open-source tools available to people with skills and experience using similar tools and workflows in their normal industry, plus the bonus of the (VC-subsidized) AI tools, mean that we could start seeing better product from smaller game/vfx studios soon. Palworld was accused of using AI in its creation, but was a success nonetheless, made by a fairly small team (which scaled up). Imagine a dozen companies like the team that made Palworld, starting small and scaling up as they gain traction.

    I liken this (and now I'm showing my age) to the desktop publishing transition. People with no prior experience doing layout and design took a computer and a printer and, starting off with awful clipart, got to the point where, at a superficial level, they were competing with professionals who were trained in layout and design and had spent many years using specialized tools like optical typesetters, wax paste-up, and Letraset transfers. However, the trained professionals, once they got hold of the same tools, were able to really up their game, and a production shop of half a dozen people might have their jobs consolidated into a single workstation and a couple of people doing finishing.

    You can look at this as glass half full for former employees who are willing to make the jump and start their own businesses, or glass half empty, because the people they've unleashed - with the skills and experience that they have - aren't cheap labor to potentially rehire later, but potential competition for the same audience.

  • Keep the coders if they can write code that isn't an embarrassment on launch. Fire nearly all of the lazy, overpriced "artists" who make textures and other art. They're like 70% of the cost of a video game.
  • I want a list of slop to avoid. My time is precious and my backlog is already big enough. If I'm only supporting CEO salaries, then I auto opt out, in so many ways.
  • This technology can let you turn out good code 30% faster or complete slop 200% faster. Guess which one management wants.
  • When people who do the actual job at hand choose to use a tool, it's because the tool is useful for the job. When people who don't do the job choose a tool, it's because they themselves are tools.
  • If your "original" content is AI generated you cannot copyright or patent it. Seems to me this is still incentivizing for companies to employ original thinkers.
  • It makes perfect sense for companies to fire more and more game content creators, as the designer can just tell AI what he/she wants and get a result in mere seconds/minutes, compared to days/weeks from a human creator. Yeah it sucks, but that's just business. Why keep an expensive modeller if AI can get the same result in much less time? We didn't keep weavers when machines arrived, we didn't keep people to write copies of an original when the printing press arrived, we didn't keep factory workers when robots arrived.
    • Play a Dungeons and Dragons game using a LLM as a DM.

      Tell us how much you like the experience afterwards, okay?

  • I'm amazed at the number of tech people who have no clue what an LLM is or how it works. I get why the CEOs of the companies and the sales people are just running around with PR bullshit, but tech people? Why?
    Simple facts, like LLMs not having state: you feed in the same input and it will always generate the same output. You can add randomness to the input, of course, but the model is not changing while being used. It's static. Stateless. Having "memory" is just a feature of the client sending in the previous dialog (o
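
    A minimal sketch of that point - the "memory" lives entirely in the client, which replays the dialog on every call. The complete() stub below is hypothetical, standing in for any stateless model API:

    def complete(prompt: str) -> str:
        """Stand-in for a stateless model API: same prompt, same output."""
        return f"[reply based on {len(prompt)} chars of context]"

    class Chat:
        def __init__(self):
            self.history = []  # the only state, and it lives client-side

        def send(self, user_msg: str) -> str:
            self.history.append(f"User: {user_msg}")
            # Every request re-sends the entire accumulated dialog.
            reply = complete("\n".join(self.history) + "\nAssistant:")
            self.history.append(f"Assistant: {reply}")
            return reply

    chat = Chat()
    print(chat.send("Hello"))
    print(chat.send("What did I just say?"))  # only answerable via the replayed history
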
  • unlike previous revolutions, which took a long time and created new industries, AI doesn't create a new industry for humans. its ultimate goal is to eliminate jobs or reduce how many humans are needed to do them. that's it.

    not really sure how that sounds like a good idea to anyone... including rich people. everything depends on lots of people spending money on stuff, and all AI seems designed to accomplish is reducing how many people have money to spend on stuff.

    though, i guess we can just hope we move to a

    • It seems logical that if the total amount of labor required to support (or even increase) our standard of living is decreasing... we ought to decrease the amount of work we do.

      So of course we have no interest in doing that. Just keep trying to overwork an ever-decreasing percentage of the population while cutting the rest out of the economy to 'figure it out'.

      Awesome.
