https://nitter.1d4.us/TheUnaButters/status/1651778740421287941

  • Dolores [love/loves]
    ·
    2 years ago

    Endless flavor text

    this would so easily compromise worldbuilding and plot unless you've just got an AI imagining 10000000 ways to talk about the weather

    • Nagarjuna [he/him]
      ·
      2 years ago

      If you ask ChatGPT to answer things as a Marxist-Leninist then it keeps things in the ML universe, so to speak. I don't see why you couldn't ask "as an Elder Scrolls character" or "as a Dunmer bard" or whatever and get something that's mostly in-universe. TES lore contradicts itself so much just from Arena and Daggerfall to Morrowind that I don't think it could be any worse than the in-universe contradictions.
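
The "answer as a Dunmer bard" idea above is just persona prompting: a system message pins the model to an in-universe voice before the player's question is passed through. A minimal sketch, assuming the common chat-completion message convention; the `bard_messages` helper and the exact wording of the system prompt are hypothetical, not from any actual game:

```python
def bard_messages(player_question: str) -> list[dict]:
    """Build a chat-style message list that keeps answers in-universe.

    The system message is sent once per conversation and constrains every
    reply; the user message carries the player's actual question.
    """
    system = (
        "You are a Dunmer bard in the Elder Scrolls universe. "
        "Answer only with in-universe knowledge; never mention the real world."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": player_question},
    ]

msgs = bard_messages("What's the weather like in Blacklight?")
```

The list would then be handed to whatever chat-completion endpoint the game uses; the persona lives entirely in the system message, so swapping NPCs means swapping one string.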

      • Dolores [love/loves]
        ·
        2 years ago

        specific to Elder Scrolls, you've got a bit of a point with inconsistencies but even if you got it to remember the things it generated in a playthrough or instance, it'd be maddening once you try to take any information and apply it to anything else.

        a dunmer bard telling me the weather in Blacklight is ashy, rainy, snowy, or sunny in separate playthroughs. like at that point i don't care to know. what's the point of learning additional 'lore' if it isn't actually lore?

        • Nagarjuna [he/him]
          ·
          2 years ago

          Hmm, it would take a lot of workshopping. I still think it's a powerful tool, and there's definitely an application for open-world RPGs.

          • ssjmarx [he/him]
            ·
            2 years ago

            I think it could work if you went the other way around. Like you have a huge database of facts about the world, and then the chat algorithm is capable of spitting them out or working them into conversations naturally. You could make an entire game out of this - think of the game Her Story, where you have to search through the interrogation clips to find the information you need, but instead the information you need to solve puzzles in the game world is in the chat algorithm, and you need to talk to the right NPCs and ask the right questions to find it.
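
The "database of facts behind the chat layer" idea above can be sketched as a toy retrieval step: the game ships a fixed fact table, and the dialogue layer only surfaces facts it actually looked up, rather than inventing new ones. The fact table, keys, and keyword matching here are hypothetical stand-ins for a real retrieval system:

```python
# Hypothetical canonical fact table: keyword tuples -> authored lore.
LORE = {
    ("blacklight", "weather"): "Ash storms blow in from Red Mountain most days.",
    ("blacklight", "ruler"): "House Redoran governs Blacklight.",
}

def npc_answer(question: str) -> str:
    """Return the first authored fact whose keywords all appear in the question.

    If nothing matches, the NPC deflects instead of making something up --
    which is the whole point of the database-first design.
    """
    q = question.lower()
    for keywords, fact in LORE.items():
        if all(k in q for k in keywords):
            return fact
    return "I couldn't say, outlander."

print(npc_answer("What's the weather like in Blacklight?"))
```

In a real game the lookup would be fuzzier (embeddings, synonyms) and an LLM might only *phrase* the retrieved fact, but the puzzle structure ssjmarx describes only needs the facts themselves to be fixed and findable.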

      • Frank [he/him]
        ·
        2 years ago

        Yeah, but thinking people tried to tie the Elder Scrolls lore into something semi-coherent, generating a lot of the most interesting concepts in the setting along the way. A language model just spits out strings of words with no semantic meaning. The LLMs can spit out Marxist-sounding gibberish because they're re-mixing the contents of untold millions of lines of text scraped from the internet. The LLM has a vast amount of information from which to cobble together a mockery of human speech.

        For an Elder Scrolls game you're not going to have that. You're going to have a few hundred pages from the books, and whatever additional information the writing team comes up with. It's going to be far, far more restrictive. The LLM can't make things up, it can only repeat variations of what's in its model. It has no self-awareness, introspection, or reflection, so at most they'd be able to hack together a very limited way to keep its responses consistent across multiple NPCs.
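
One "hacked-together" way to get the cross-NPC consistency mentioned above is a shared canon store: the first time any NPC's dialogue invents a fact, it gets pinned, and every later NPC reads the pinned entry instead of re-generating it. A minimal sketch; the generator here is a random stand-in for an LLM call, and the key scheme is made up for illustration:

```python
import random

# Shared world-state: generated facts, pinned on first use.
CANON: dict[str, str] = {}

def canon_fact(key: str, generate) -> str:
    """Return the stored fact for `key`, generating and pinning it on first use.

    Every NPC that asks about the same key afterwards gets the same answer,
    which is exactly the "ashy/rainy/snowy/sunny in separate playthroughs"
    problem this thread complains about.
    """
    if key not in CANON:
        CANON[key] = generate()
    return CANON[key]

weather = lambda: random.choice(["ashy", "rainy", "snowy", "sunny"])
first = canon_fact("weather:blacklight", weather)
# A second NPC asking later sees the identical answer:
assert canon_fact("weather:blacklight", weather) == first
```

This keeps one playthrough internally consistent, though it does nothing for Dolores's deeper objection: the pinned fact still isn't lore outside that save file.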

        These things are very limited. I know people are having fun reading the text they spit out, but the number of people who still think an LLM can determine the truth value of the statements it generates or "know" what it's talking about is frightening.