Basically every time AI tries to create its own thing, it's incrementally shittier than whatever it trained on. As more and more AI-produced content floods the internet, it's increasingly training on AI-generated material. The effect is analogous to scanning and printing the same document over and over again, until it ultimately becomes a blurry mess. AI cannot create on its own; it can only remix pre-existing human work.
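The scan-and-reprint analogy can be sketched as a toy simulation (a hypothetical illustration, not the actual training process): each "generation" fits a simple model to the previous generation's output and then samples fresh data from that fit, so estimation error compounds and detail from the original human data tends to get lost.

```python
import random
import statistics

# Toy model of generational training collapse:
# generation 0 is "human" data; each later generation fits a
# normal distribution to the previous generation's samples and
# then trains (samples) only from that fit.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(500)]  # original human data

spreads = []
for generation in range(10):
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    spreads.append(sigma)
    # next generation never sees the originals, only the fit
    data = [random.gauss(mu, sigma) for _ in range(500)]

# Each refit adds estimation error on top of the last; over many
# generations the distribution drifts away from the original,
# and rarer "tail" content tends to disappear first.
print(spreads)
```

The direction of drift on any single run is random, but the key point survives: nothing in the loop ever pulls the model back toward the original data, so error only accumulates.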

The article's main solution is to keep some kind of master backup of work labelled as existing before the rise of LLMs, but it isn't optimistic about this actually happening. I'm wondering: if in a few years the "write TV script" button on ChatGPT generates completely unworkable garbage, will studios stop pretending it's a viable replacement for writing staff?

  • VernetheJules [they/them]
    ·
    1 year ago

    The researchers conclude that in a future filled with gen AI tools and their content, human-created content will be even more valuable than it is today — if only as a source of pristine training data for AI.

    Nonsense, we have a bright future ahead of us! As

    :soypoint-1: CONTENT CREATORS

    • LeninsBeard [he/him]
      ·
      1 year ago

      Hooking up every living human to the matrix so I can AI generate a New Yorker article about how my trip to Colombia was subpar because the locals didn't bow to me.