Basically every time AI tries to create its own thing, it comes out incrementally shittier than whatever it trained on. As more and more AI-produced content floods the internet, models are increasingly training on AI-generated material. The effect is analogous to scanning and printing the same document over and over: it ultimately becomes a blurry mess. AI cannot create on its own; it can only remix pre-existing human work.
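The photocopy-of-a-photocopy effect is easy to demo in miniature. This is just a toy sketch, not how real model training works: fitting a normal distribution stands in for "training," and resampling from the fit stands in for "generating the next training set." Each round loses a little of the original variety, and the losses compound.

```python
import random
import statistics

# Toy model of "training on your own output": fit a normal distribution
# to the data, then replace the data with fresh samples from that fit.
# Each round is one generation of models trained on the previous
# generation's synthetic output.

def fit_and_resample(samples):
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return [random.gauss(mu, sigma) for _ in samples]

random.seed(1)
data = [random.gauss(0.0, 1.0) for _ in range(10)]  # the "human" originals

for generation in range(500):
    data = fit_and_resample(data)

# The spread collapses over generations: each fit's sampling error
# compounds, so later generations cover less and less of the original
# variety -- typically far below the original spread of ~1.0.
final_spread = statistics.stdev(data)
print(final_spread)
```

Nothing here is specific to LLMs; it's just the statistical core of why re-training on your own output tends to narrow and degrade rather than improve.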
The article's main solution is to keep some kind of master backup of work labelled as predating LLMs, but it isn't optimistic about that actually happening. I'm wondering: if, in a few years, the "write a TV script" button on ChatGPT generates completely unworkable garbage, will studios stop pretending it's a viable replacement for writing staff?
Totally agreed on that point. I love using AI to break writer's block because it pulls out interesting ideas that I never would have come up with on my own. Sort of like an electronic writers' room.
AI under capitalism sucks, though. It feels like the next blockchain hype.
How exactly do you use it for that? Is it specific to each situation you're facing, or do you have go-to prompts for it?
It depends on the model you're using. I use NovelAI, which has a lorebook feature that keeps track of worldbuilding. Each character, location, and historical event has its own entry with tags.
You can also tweak generation parameters to ban certain words, increase or decrease generation randomness, and use modules if you want to stick to a specific author's style.
This really helps me avoid writer's block because the AI can refer to lore I wrote. Sometimes it brings up things I'd forgotten about from previous chapters, or I can ask it for an alternative way to write something when my writing is getting too repetitive.
It still needs a lot of guidance to produce an interesting novel, but as an assistant it's great.