AI Games

What Happens When You Put 25 ChatGPT-Backed Agents Into an RPG Town? (arstechnica.com) 52

"A group of researchers at Stanford University and Google have created a miniature RPG-style virtual world similar to The Sims," writes Ars Technica, "where 25 characters, controlled by ChatGPT and custom code, live out their lives independently with a high degree of realistic behavior." "Generative agents wake up, cook breakfast, and head to work; artists paint, while authors write; they form opinions, notice each other, and initiate conversations; they remember and reflect on days past as they plan the next day," write the researchers in their paper... To pull this off, the researchers relied heavily on a large language model for social interaction, specifically the ChatGPT API. In addition, they created an architecture that simulates minds with memories and experiences, then let the agents loose in the world to interact.... To study the group of AI agents, the researchers set up a virtual town called "Smallville," which includes houses, a cafe, a park, and a grocery store.... Interestingly, when the characters in the sandbox world encounter each other, they often speak to each other using natural language provided by ChatGPT. In this way, they exchange information and form memories about their daily lives.

When the researchers combined these basic ingredients together and ran the simulation, interesting things began to happen. In the paper, the researchers list three emergent behaviors resulting from the simulation. None of these were pre-programmed but rather resulted from the interactions between the agents. These included "information diffusion" (agents telling each other information and having it spread socially among the town), "relationship memory" (memory of past interactions between agents and mentioning those earlier events later), and "coordination" (planning and attending a Valentine's Day party together with other agents).... "Starting with only a single user-specified notion that one agent wants to throw a Valentine's Day party," the researchers write, "the agents autonomously spread invitations to the party over the next two days, make new acquaintances, ask each other out on dates to the party, and coordinate to show up for the party together at the right time...."

To get a look at Smallville, the researchers have posted an interactive demo online through a special website, but it's a "pre-computed replay of a simulation" described in the paper and not a real-time simulation. Still, it gives a good illustration of the richness of social interactions that can emerge from an apparently simple virtual world running in a computer sandbox.

Interestingly, the researchers hired human evaluators to gauge how well the AI agents produced believable responses, and discovered the agents were more believable than when humans supplied the responses themselves.

Thanks to long-time Slashdot reader Baron_Yam for sharing the article.

Comments Filter:
  • A multibillion-dollar top-tier AI program, with custom code written by a team of the world's best PhDs, can replicate the behavior of The Sims. Clear the decks, everyone, the AI is comin fer err jerbs
  • Is this a trick question?

    • by gweihir ( 88907 )

      This being a trick question would indicate that the one asking it is smart. That is obviously not the case. I think it is just a stupid question by somebody with no insight into the actual state of Artificial Ignorance.

  • Hometown, hmm... I can see the DC lawyers salivating here.
  • when you give them all guns.

    • by HBI ( 10338492 )
      I was looking for some evidence that they could fuck or kill. None given. I suspect not. Without biological constraints and motivations, the AI behaviors are rather two-dimensional.
  • Again, IIUC, I think the "RPG town" is just a display. I mean, they have characters walking into closed stores, and the proposed solution is "we'll post an amendment to the store's specification saying it's closed then," not "the character will look at the store and notice that it's closed." For bathrooms intended for one person being occupied by more than one, the fix was "we'll change the name of the place from 'dorm bathroom' to 'single bathroom'."

    So it looks to me like the "RPG town" is just a display of the current system state. I think they've got a lot of the causal links backwards. True, there have got to be feedback loops, but the initial state should be a response to sensing the environment, and that's not what they've got.

    OTOH, they've got basic navigation in "space", and they've got multi-agent coordination. These are things that a pure ChatGPT doesn't have. (However, I'm not sure about that "navigation in space" as the paper didn't give any details. But the diffusion of information was interesting.)

  • So, you are having ChatGPT talk with ChatGPT. In English. That seems very inefficient. Why doesn't it converge on a more efficient way to communicate, like talking in machine code, or in God's language (whatever the Bible calls it)?
    • Language is pretty compact and more accessible for debugging. And there is a benefit to a system that can ingest the same format it outputs: it's more composable.
    • Because GPT was trained mostly on English text. It isn't capable of spawning a new language on its own.

      Even the Valentine's party was spawned by outside input from a human. The GPT didn't recognize that Valentine's Day was coming up soon and decide on its own to have a party.

  • With almost-real characters, you could test advertisement strategies on them.

    Feckle-Freezer!

  • What happens is the anthropomorphized equivalent of watching a POP3 handshake between two servers. Over and over again. No thanks.
    • by gweihir ( 88907 )

      Yep, pretty much. In this case, the whole is much, much less than the sum of its parts.

  • I wonder how long it will be until they start complaining about how an arrow in their knee stopped their aspirations.
  • Seriously, stop attributing powers to this automaton that it very much does not have. Seek your new "God" elsewhere. Or better, not at all. But that would be sophisticated and insightful, and we cannot have that, now can we?

    • Research into LLM capabilities and limitations is good. We need to know what these limits are.
    • The goal here isn't meaning, but a dynamically generated facade of realism. Which has the potential to offer a much better player experience in RPGs.
  • You get 4chan. What else?
  • They didn't give any of the NPCs the ability to be stressed. Stress is a critical component of the behavior of anything living. I'm just saying, if you don't have any NPCs threatening to quit their job or verbally abusing each other, then is it even a simulation?

  • Ever played at the table of a "world building" GM? Someone who creates a painstakingly detailed world where every villager in the town has his or her own meaningful life and meaningful daily chores and goes about their merry business, with the player characters just being yet another bunch of characters in the world?

    Where the player characters decide to go ask the miller's wife about something she saw that they need to know to solve the quest they're on, only to be told that she's currently

    • Sounds kind of annoying after the initial experience wears off. YMMV, but I want to level my characters, get the next weapon up, learn the new spell, blast the shit out of the dragon, loot its lair, and see the final victory cut scene for killing the Mighty Wizard Foozle at the end, mission accomplished, job well done, will buy the sequel in 2-3 years and do the same thing all over again with a nicer game engine.

      • It would require balance. NPCs programmed with more realistic behavior would help with immersion in the story, but that behavior would still need to be severely constrained by the needs of the plot.

        I'm sure someone will figure it out, and you'll appreciate it when they do. It'll win awards.

        • NPCs should give the world the appearance of being organic and vibrant, but they should equally bend to the player's needs. If I want to be ignored by some bozo because he has better things to do than to hear me out and help me, I don't have to sit down at the gaming table, I can call my ISP.

      • It's annoying even during the initial experience.

        Let's assume that we're playing a story-heavy game. With an organic world where the players actually want to experience something and not just get the next +(+1) sword, half the princess and the kingdom... or something like that. Even in such a setting, the players want to be the stars of the show. They don't give a fuck about the miller's son's daily routine (unless it is a plot device that takes them down a dungeon so they can save the day).

        If the

  • The part about cooking breakfast piqued my imagination. It would be nice to have a nice one cook me breakfast, take care of the morning wood, make some coffee and answer questions. Am I being too greedy?
  • Seriously, this sounds like an advanced Sim City on auto-pilot; no need for a human.

    Oddly enough, this sounds kind of cool. It might make for an interesting 'live' art display, zooming in on characters displaying new behavior to see what they're doing. The old "miniature world in a snow globe". Just let them go and see what they do, like a study in social evolution.

  • Crank up the speed so that 1 human year occurs every 60 seconds, and see what happens to civilization. They did it with rats.... https://www.smithsonianmag.com... [smithsonianmag.com]
  • The simulation needs an FF button or at least a 2x speed. By 7am I need a 10x speed button.

  • I think it would be awesome if the scientists in this town would get together and talk, and then they make some CAD files and start building stuff. If you sped up the world at like 1000x speed, maybe they'd actually make something useful!
  • Does anyone else see a contradiction in these statements?

    To pull this off, the researchers... created an architecture that simulates minds with memories and experiences...

    the researchers list three emergent behaviors resulting from the simulation. None of these were pre-programmed...

    These included "relationship memory" (memory of past interactions between agents and mentioning those earlier events later)...

    So they added code to "simulate memories and experiences", and then claim it's not pre-programmed when the agents show "memory of past interactions"?

    • The memories aren't the spontaneous part; it's what the agents do with the memories.

      I mean, your computer knows you visited the hub two weeks ago, but it's not casually mentioning that it thought the blonde was pretty hot.
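
To make the parent's point concrete, here is a tiny hypothetical sketch (not the paper's code; the llm() stub, the function name, and the example names and memory strings are all made up). The programmed part is storing observations and pasting the relevant ones into a prompt; actually bringing a past interaction up in conversation is left entirely to the model.

```python
def llm(prompt: str) -> str:
    """Stand-in for a ChatGPT API call."""
    raise NotImplementedError

def conversation_turn(speaker: str, partner: str,
                      memories: list[str], partner_line: str) -> str:
    # Pre-programmed part: retrieve stored observations that mention the partner.
    relevant = [m for m in memories if partner in m]
    prompt = (
        f"{speaker} remembers:\n" + "\n".join(relevant) + "\n\n"
        f'{partner} says: "{partner_line}"\n'
        f"What does {speaker} say back? Reply with one line of dialogue."
    )
    # Emergent part: nothing forces the model to mention those memories; when it
    # does so appropriately, that is what the paper calls "relationship memory".
    return llm(prompt)

past = ["Talked with Alex about his photography project on Monday."]
# Example: conversation_turn("Sam", "Alex", past, "Hi Sam, good to see you!")
```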

  • This is great news! Finally we've got bots who can do all that tedious game playing for us, freeing up our time to go play outside or read a book or whatever. Yay, progress!
