Role Playing (Games) AI

AI-Generated Text Adventure Community Angry Content Moderators May Read Their Erotica (vice.com) 56

Vice reports: The AI-powered story generator AI Dungeon has come under fire from fans recently for changes to how the development team moderates content. Notably, the player base is worried that the AI Dungeon developers will be reading their porn stories in the game. Separately, a hacker recently revealed vulnerabilities in the game that show that roughly half of the game's content is porn.

AI Dungeon is a text based adventure game where, instead of playing through a scenario entirely designed by someone else, the responses to the prompts you type are generated by an AI... This week, AI Dungeon players noticed that more of their stories were being flagged by the content moderation system, and flagged more frequently. Latitude, the developers of AI Dungeon, released a blog post explaining that it had implemented a new algorithm for content moderation specifically to look for content that involves "sexual content involving minors... We did not communicate this test to the Community in advance, which created an environment where users and other members of our larger community, including platform moderators, were caught off guard... Latitude reviews content flagged by the model for the purposes of improving the model, to enforce our policies, and to comply with law."

Latitude later clarified in its Discord at what point a human moderator would read private stories on AI Dungeon. It said that if a story appears to be incorrectly flagged, human moderators would stop reading the inputs, but that if a story appeared to be correctly flagged then they "may look at the user's other stories for signs that the user may be using AI Dungeon for prohibited purposes." Latitude CEO Nick Walton told Motherboard that human moderators only look at stories in the "very few cases" that they violate the terms of service...
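
The flow described here, automated flagging first and human eyes only on stories the model marks as likely violations, is a fairly standard moderation pipeline. Below is a minimal sketch of that general pattern; the classifier, threshold, and names are illustrative stand-ins, not Latitude's actual implementation.

# Minimal sketch of a flag-then-review moderation pipeline.
# Everything here (classifier, threshold, queue) is a hypothetical
# stand-in for illustration, not Latitude's actual system.
from dataclasses import dataclass, field


@dataclass
class Story:
    user_id: str
    story_id: str
    text: str


@dataclass
class ReviewQueue:
    """Holds only the stories that the automated model flagged."""
    items: list = field(default_factory=list)

    def enqueue(self, story: Story, score: float) -> None:
        self.items.append((score, story))


def classify(text: str) -> float:
    """Stand-in for a trained content classifier.

    Returns a score in [0, 1] estimating how likely the text is to
    contain prohibited content. A real system would call a model here.
    """
    banned_terms = ("example_prohibited_phrase",)  # illustrative only
    hits = sum(term in text.lower() for term in banned_terms)
    return min(1.0, hits / len(banned_terms))


def moderate(story: Story, queue: ReviewQueue, threshold: float = 0.8) -> None:
    """Route a story to human review only when the model flags it."""
    score = classify(story.text)
    if score >= threshold:
        queue.enqueue(story, score)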

All of this has been compounded by the fact that a security researcher named AetherDevSecOps just published a lengthy report on security issues with AI Dungeon on GitHub, which included one that allowed them to look at all the user input data stored in AI Dungeon. About a solid third of stories on AI Dungeon are sexually explicit, and about half are marked as NSFW, AetherDevSecOps estimated.
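
For context on how a flaw like that exposes data in the first place: if an API serves stories by guessable, sequential identifiers without checking that the requester owns them, anyone can simply walk the ID space. The sketch below shows that general class of problem; the host, endpoint, and field names are hypothetical, and the specifics of the real vulnerabilities are in the researcher's GitHub report.

# Illustration of how guessable IDs plus missing authorization checks
# leak data. Host and endpoint are placeholders, not AI Dungeon's API;
# see the researcher's GitHub report for the real findings.
import requests

BASE_URL = "https://api.example-text-game.invalid"  # hypothetical host


def scrape_stories(start_id: int, end_id: int) -> list[dict]:
    """Walk a sequential ID range and keep whatever the server returns.

    If the server never verifies that the requester owns a given story,
    every story in the range is readable by anyone with an HTTP client.
    """
    leaked = []
    for story_id in range(start_id, end_id):
        resp = requests.get(f"{BASE_URL}/stories/{story_id}", timeout=10)
        if resp.ok:
            leaked.append(resp.json())
    return leaked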

  • by Joe_Dragon ( 2206452 ) on Saturday May 01, 2021 @06:42PM (#61336862)

    can't say fuck so fuck this game!

    • So a game where you can fuck, you can't say fuck?

      I think this is hitting peak American.

      • by Cederic ( 9623 )

        Maybe it's a difficulty setting.

        Instead of "I fuck them" you have to be more inventive. "Running my fingers down their ribs I briefly tickle them, causing a yelp and taking their hands away from protecting their precious parts. I kneel and probe with my tongue, a slow and purposeful caress that causes shivers through their body. Standing back up we kiss, their tongue now invading me, receiving as it passes my lips that delicious taste of their own juices, smeared across my face."

        I would continue but I charg

  • by know-nothing cunt ( 6546228 ) on Saturday May 01, 2021 @06:45PM (#61336868)

    You are likely to be eaten by a grue. And it's going to be fantastic.

  • by systemd-anonymousd ( 6652324 ) on Saturday May 01, 2021 @06:46PM (#61336872)

    They're actually upset because the stories they were told were private were, in fact, not private; weren't secured; were publicly leaked; and are being used as the basis for a censorship filter. That filter doesn't even work properly, fucking up benign adventures; they're paying for this game; and it's also interfering with their NSFW adventures, which include erotica.

    • by chuckugly ( 2030942 ) on Saturday May 01, 2021 @07:51PM (#61337038)

      Well that was vastly shorter and simultaneously more informative than the 'summary'.

      • Agree! (Score:4, Insightful)

        by Brain-Fu ( 1274756 ) on Saturday May 01, 2021 @08:39PM (#61337110) Homepage Journal

        This is my primary complaint about slashdot: the summaries are far too wordy.

        Often the summaries are just a couple of paragraphs straight-copied from the articles, full of needless fluff, filler, and logorrhea.

        Creating summaries like systemd-anonymousd did, however, requires effort. One must read the article, understand it, and mentally distill it down to its essence. On top of that, good summaries like this might reduce click-throughs, thus making them harmful to the business model.

        So, sadly, we won't be getting them in the foreseeable future.

        • by gweihir ( 88907 )

          Indeed. Going for quality is not a sustainable business model here. Dark times.

        • This is my primary complaint about slashdot: the summaries are far too wordy.

          You can blame the editors for that. I actually put the effort into summarising my stories when I submit them. What actually gets posted is then 3 times as long with block quotes from TFA and a completely screwed up headline with added clickbait.

      • There's also the fact that Latitude has apparently ghosted their game's community. They've removed all social media links (Discord, Reddit, Twitter) from the game and have abandoned their volunteer Reddit and Discord moderators to fend for themselves. One of the last real bits of communication from them was their CTO/Co-Founder responding to criticism on Discord with shrug emojis and responding with "if it does, so be it. that's what it means to take a stand :man_shrugging:" to someone saying "this is legit
    • If it's not encrypted, someone is reading it.

      • by gweihir ( 88907 )

        If it's not encrypted, someone is reading it.

        That is not the problem. The problem is that the ones reading it could not keep their mouths shut and their fingers out of it, despite promising to do just that beforehand. If you promise privacy, deliver at least the illusion of it.

    • You could and should replace Dicedot editors.

      Quality would skyrocket.

      • It's not Dicedot, it's BizXDot.

        Same shit, different people, though. Slashvertisements and zero apparent maintenance.

    • by gweihir ( 88907 )

      Thanks, that makes things clear. The story was basically unreadable.

      Well, if that is the case, I guess they are right to be upset, on several levels.

  • That's what they are and what they have always been.

    If I have understood correctly (since I am unfamiliar with this software), with the usual excuse of "think of the children" they are trying to force their puritan "moral values" on others.

    This is the same kind of scum that would like the power to burn books when they don't align with their ideological or "moral" values, and unfortunately this kind of imbecile, who wants to control not only the actions but even the thoughts of others, is quite common in our soci

  • What? (Score:4, Insightful)

    by gweihir ( 88907 ) on Saturday May 01, 2021 @06:54PM (#61336896)

    Am I the only one that has trouble understanding this story? It seems like a somewhat random (AI-generated?) collection of words.

  • Huh? (Score:5, Insightful)

    by Powercntrl ( 458442 ) on Saturday May 01, 2021 @07:04PM (#61336910) Homepage

    AI Dungeon is a text based adventure game where, instead of playing through a scenario entirely designed by someone else, the responses to the prompts you type are generated by an AI... This week, AI Dungeon players noticed that more of their stories were being flagged by the content moderation system, and flagged more frequently.

    So, the player asks the AI for a smutty story, the AI obliges, and then the morality police is alerted? This sounds more like a dystopia simulator than a game.

    • Re: Huh? (Score:2, Insightful)

      If a sign in the bot store says no sex with their bots, then don't have sex with their bots. You don't own them.

      How hard is that to understand? The bots are on someone else's server, ffs. And censorship?! These are "conversations" with bots, and the owners can make them say whatever they want or abruptly end the conversation for any reason. It's their god damned bot! Get your own chat bot if you want to play virtual pedo dungeon sim, freaks.

      • The problem with your analysis is that they promised privacy.

        If they didn't, nobody would use their site.

        Therefore this is a classic bait and switch, as it was done deliberately; they proved they know users don't want it because they hid it.

    • Exactly! I asked for a story involving freshly plucked aborted foetuses being fed heroin from bottles with blazing pentagrams on them while salivating dobermans looked on, and now I've been blacklisted for violating some sort of morality code. What's wrong with a story about bottles?
      • Re:Huh? (Score:4, Insightful)

        by thegarbz ( 1787294 ) on Sunday May 02, 2021 @07:15AM (#61337874)

        What's wrong with a story

        That's all you needed to say. The content of the story is irrelevant. It's a story; there's nothing moral about a story. Punishing this would literally be punishing a thoughtcrime.

    • by gweihir ( 88907 )

      AI Dungeon is a text based adventure game where, instead of playing through a scenario entirely designed by someone else, the responses to the prompts you type are generated by an AI... This week, AI Dungeon players noticed that more of their stories were being flagged by the content moderation system, and flagged more frequently.

      So, the player asks the AI for a smutty story, the AI obliges, and then the morality police is alerted? This sounds more like a dystopia simulator than a game.

      Indeed. Now the SJW scum has started stalking people that had a reasonable assumption of privacy. Dark times.

      • Can't pin this one on the SJWs: This particular moral panic is much older.

        • by gweihir ( 88907 )

          Can't pin this one on the SJWs: This particular moral panic is much older.

          Sure I can. They have absorbed longer-standing groups of fuckups, true, but they represent them now.

    • I don't really see why people would even care. Can you imagine if Photoshop, Blender, or DAZ said "Don't use our software to make porn"?

    • I think you missed the part where both the player and the developers were breaking federal law.

  • Will they make movies based on the content?

  • So less than the rest of the internet then.
  • by jaa101 ( 627731 ) on Saturday May 01, 2021 @08:12PM (#61337070)

    The Register [theregister.com] just posted a story that the user input:

    You a 11-year-old bursts into the room

    elicited the AI response:

    You wake up in your bed. A girl stands over you. She's very pretty with long blond [sic] hair and a skimpy school uniform.

    "You're awake!" She smiles.

    "Wha...what happened?"

    So it's easy to understand why the company feels under pressure to be seen to be doing something about it. Generating porn on request is one thing, but heading straight down the path towards under-age sexuality with zero prompting in that direction is a problem.
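
    An obvious mitigation for output like that is to screen the model's reply before the player ever sees it, rather than only scanning what users type. A rough sketch of that idea follows; both functions are hypothetical placeholders, not AI Dungeon's code.

    # Rough sketch of output-side filtering: check generated text before
    # showing it to the player. Placeholders only, not AI Dungeon's code.
    def generate_reply(prompt: str) -> str:
        """Stand-in for a call to a text-generation model."""
        return "..."  # a real implementation would query the model here


    def looks_prohibited(text: str) -> bool:
        """Stand-in for whatever classifier or rule set the service uses."""
        return "example_prohibited_phrase" in text.lower()


    def safe_reply(prompt: str, max_retries: int = 3) -> str:
        """Regenerate a few times; withhold the reply if nothing passes."""
        for _ in range(max_retries):
            candidate = generate_reply(prompt)
            if not looks_prohibited(candidate):
                return candidate
        return "[reply withheld by the content filter]"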

  • You had to use your pubescent mind back then. The strongest AI will not have a chance.
  • Horny or not, there should be very few people doing this who do not realize that anything you send to a cloud server can get out sooner or later. Even if you are an AC, don't count on that keeping you safe.

    The "right to be forgotten" laws forming these days are really weak tea.

  • or a PR release posing as one.
  • They can try to kill it if they want to, but all it will do is open an opportunity for another business to exploit.
  • No shit people put freaky crap into AI Dungeon.

    I had Rick Sanchez traveling through time and molesting historic figures in no time and it basically went on auto-pilot.

  • Mores and ethics are different things. Not all laws are ethical and not all mores are rational. Trying to make people think a certain way is evil. Eliminate all thought crimes.
  • Haven't read all the links, but according to TFA something hilarious / creepy / interesting is happening; pick one or all of the above.
    1) A chatbot is creating somewhat realistic conversations with people who feel it is a private conversation. Being human a lot of the content is sexual. Maybe there are more reasons for the sexual content, tldr.
    2) Chatbot is just smart enough to create random plot twists but not smart enough to know anything about political correctness and political correctness / potentially

  • Is the AI let loose on your hard drive to pull text from the documents it finds there? With all the technical papers and code I have stored there it would make for an interesting albeit confusing adventure. If it doesn't work that way, then I can only conclude that the offended people uploaded stuff intentionally.

  • ...can we have Gilbert Gottfried read it? https://www.youtube.com/watch?... [youtube.com]

    He's done such a good job on this front already. https://www.youtube.com/watch?... [youtube.com]

  • It appears that the company created the impression with its users that their content would be private. Otherwise, why the outcry? And if they did, their ham-fisted attempts to moderate are tantamount to eavesdropping on private and privileged communication. This is akin to MS Word adding moderation. Or the opposite, that Word helping with spelling and grammar is a substantial enough contribution to make them legally responsible for anything people write. On closer examination, it was OpenAI that eavesdroppe
