Before we start, let's just get the basics out of the way - yes, stealing the work of hundreds of thousands if not millions of private artists without their knowledge or consent and using it to drive them out of business is wrong. Capitalism, as it turns out, is bad. Shocking news to all of you liberals, I'm sure, but it's easy to call foul now because everything is wrong at once - the artists are losing their jobs, the slop being used to muscle them out is soulless and ugly, and the money is going to lazy, talentless hacks instead. With the recent implosion of the NFT space, we're still actively witnessing the swan song of the previous art-adjacent grift, so it's easy to go looking for problems (and there are many problems). But what if things were different?

Just to put my cards on the table, I've been pretty firmly against generative AI for a while, but I'm certainly not opposed to using AI or Machine Learning on any fundamental level. For many menial tasks like Optical Character Recognition and audio transcription, AI algorithms have become indispensable! Tasks like these are grunt work, and by no means is humanity worse off for finding ways to automate them. We can talk about the economic consequences or the quality of the results, sure, but there's no fundamental reason this kind of work can't be performed with Machine Learning.

AI art feels... different. Even ignoring where companies like OpenAI get their training data, there are a lot of reasons AI art makes people like me uneasy. Some of them are admittedly superficial, like the strange proportions or extra fingers, but there's more to it than that.

The problem for me is baked into the very premise - making an AI to do our art only makes sense if art is just another task, just work that needs to be done. If sourcing images is just a matter of finding more grist for the mill, AI is a dream come true! That may sound a little harsh, and it is, but it's true. Generative AI isn't really art - art is supposed to express something, or mean something, or do something, and generative AI is fundamentally incapable of functioning on this wavelength. All the AI works with is images - there's no understanding of ideas like time, culture, or emotion. The entirety of the human experience is fundamentally inaccessible to generative AI simply because experience itself is inaccessible to it. An AI model can never go on a walk, or mow a lawn, or taste an apple; it's just an image generator. Nothing it draws for us can ever really mean anything to us, because it isn't one of us.

Oftentimes, I hear people talk about this kind of stuff almost like it's just a technical issue, as if once they're done rooting out the racial bias or blocking off the deepfake porn, then they'll finally have some time to patch in a soul. When artist Jens Haaning mailed in 2 blank canvases titled "Take the Money and Run" to the Kunsten Museum of Modern Art, it was a divisive commentary on human greed, the nature of labor, and the non sequitur pricing endemic to modern art. The knowledge that a real person at that museum opened the box, saw a big blank sheet, and had to stick it up on the wall, the fact that there was a real person on the other side of that transaction who did what they did and got away with it, the story around its creation, that is the art. If Stable Diffusion gave someone a blank output, it'd be reported as a bug and patched within the week.

All that said, is AI image generation fundamentally wrong? Sure, the people trying to make money off of it are definitely skeevy, but is there some moral problem with creating a bunch of dumb, meaningless junk images for fun? Do we get to cancel Neil Cicierega because he wanted to know how Talking Heads frontman David Byrne might look directing traffic in his oversized suit?

Maybe just a teensy bit, at least under the current circumstances.

I'll probably end up writing a part 2 about my thoughts on things like data harvesting, not sure yet. I feel especially strongly about the whole "AI is just another tool" discourse when people are talking about using these big models, so don't even get me started on that.

  • JohnBrownNote [comrade/them, des/pair] · 7 months ago

    Before we start, let's just get the basics out of the way - yes, stealing

    it's not stealing, and it's not intelligence. we can't have a conversation about this or even think about it properly if we're using capitalist ideas about ownership and advertising-copy definitions of terms. obviously the corporations are in the wrong, but it's apparently pretty easy to spin up your own model on a GPU that a lot of people own, so there's more here to unpack than the surface-level luddite anticapitalism.

    I'll just say i see a lot of idealism in here, and doing something with a computer doesn't make it different from doing it manually - like if i shredded a bunch of prints and then made an unrelated mosaic with the pieces where you could still identify part of mona lisa's eye.

    • OutrageousHairdo [he/him] · edited · 7 months ago

      This so colossally misunderstands every fucking point I made, nothing you said here is correct. Just to enumerate:

      • Yeah, it is fucking theft if it happens under capitalism. The whole first paragraph was getting present-day conditions out of the way before I indulged in hypotheticals. Pretending like it's "wrong" to try and hold ownership over your own work is just a total non sequitur if we're discussing present-day capitalist conditions, for reasons I should hope are completely obvious to any Hexbear native.
      • I'm using the industry-accepted term that everyone already knows and uses, I'm not going to weigh down my writing with a bunch of air quotes. My whole fucking point with this is that it's not intelligent.
      • It's still wrong if you as an individual make your own model instead of using the corporate AI. If you're using the same training sets full of internet art to churn out AI art, or a pre-trained model which makes use of those, I'm still going to look down on you. I know we're all pro-piracy here, and I am too, but it's different when you do it to normal people.
      • It sounds like you're being deliberately obtuse here. The whole reason why AI sucks at art is that it represents the deliberate lack of choice, the absence of human intervention. The only reason you should be asking AI to make something for you is if you genuinely don't give enough of a shit to shape that part of your work yourself. Like, sure, if you didn't want to do a bunch of sand and dirt textures for your video game, go ahead I guess, nobody's out there pouring their soul out by deciding the exact arrangement of rocks on the floor, but for anything else it just feels counterproductive. The vast majority of my friends are digital artists, I certainly have no issue with computers; it just sounds like you're trying to avoid engaging with my points. The only way to make AI art into real art is to add the humanity back into it, and I don't see many artists doing that sort of thing. Most so-called "AI artists" I've seen just retry until they get an image they like, feed that one back in with some stuff they wanna change highlighted purple, maybe photoshop out the shitty hands if they're really going the distance, and then they're done. I get that the definition of art is subjective and all, but it just isn't enough to clear the bar for me when the AI is making like 90% of the decisions and doing 95% of the work, and most people are not putting in that level of effort to begin with.
      • JohnBrownNote [comrade/them, des/pair] · 7 months ago

        not gonna rehash things we fundamentally disagree on a third time

        It sounds like you're being deliberately obtuse here. The whole reason why AI sucks at art is that it represents the deliberate lack of choice, the absence of human intervention.

        i dunno, man, "the deliberate lack of choice, the absence of human intervention" sounds pretty artistic to me, looking around at the kind of meta slop we consider to be art.