Pedos ruin everything...
Really easy to see where this is going.
"open source image synthesis technologies such as Stable Diffusion allow the creation of AI-generated pornography with ease, and a large community has formed around tools and add-ons that enhance this ability. Since these AI models are openly available and often run locally, there are sometimes no guardrails preventing someone from creating sexualized images of children, and that has rung alarm bells among the nation's top prosecutors. (It's worth noting that Midjourney, DALL-E, and Adobe Firefly all have built-in filters that bar the creation of pornographic content.)"
Paid software that can be reined in so it doesn't compete with Netflix and Disney is fine; the open source stuff is satan spawn.
The easy solution would be to go after the ones who distribute the pictures; this is only about keeping the gravy train going.
I agree with this, but don't have much hope of anything passing. They didn't outlaw underage hentai, so I feel like this is an uphill battle they'll give up on.
What a terrible thing to try to unravel. And something we as a society should be very focused on solving.
Obviously there is little someone can do to prevent it since it can run locally. Making the exchange of CSAM illegal is easier.
I just hope the AI stuff reduces the exploitation of real children.
Then maybe we can focus on therapy for these sick minded fucks that create and consume CSAM.
I thought the same, but the article puts it into perspective: first, training anything requires having content on hand, which means children were exploited to get it. Second, enabling and allowing it feeds deeper desires, possibly pushing people even further. Definitely above our pay grades here to determine if that's good or bad. The letter calls out that by allowing it, we as a society are normalizing it, saying it's okay, and we definitely do not want it normalized.
Unfortunately, this probably does mean the end of the wild west of AI generation. I don't think it will stop tools from being created; it sounds like (even though Ars somehow demonized the term 'open source') tools like SD and training are going to stick around, probably because they know they'd just be forked and continued later. But I think we're going to see a lot more regulation on sharing models. Right now there are a few obvious sites sharing things that even I've been shocked are allowed; I think we're going to see a lot more rules on what can and can't be uploaded.
Even as the tech gets easier, fine-tuning and training models is a much more involved process than most people will want to go through, so stopping the sharing of those models will cut out the vast majority. (Similar to the split between those who share vs. those who consume illegal material already on the internet.)
Looks like the first step they're calling for is basically saying that even if CSAM was created by AI, it should still be classified legally as CSAM, so trading and sharing it means a one-way ticket to jail, and I'm personally okay with that.
I work for a hosting provider, and recently we received a report about a user hosting AI-generated CSAM. I verified it and forwarded it to the legal team. They told him to GTFO.
He left a negative review because we "wouldn't let him host AI generated content". Nuh-uh sir, that is not why. Some people are just so out of touch with reality.