I'm gonna level with you guys: there's seriously no way to tell if some of these are AI generated or not. The telltale signs aren't whether people look ugly (StyleGAN actually tends to generate pretty faces); the stuff it gets wrong is well documented, like spaghetti-ish backgrounds, facial accessories, splotchy hair, and repeated teeth. I'm not talking about whether it's plausible that Zenz padded the database with a bunch of deepfakes, just that for a set of face images with uniform backgrounds, no visible facial accessories, hair pulled back, and no teeth showing, it's pretty fucking hard to tell if these pictures are real or not. I don't know if it matters, because Adrian Zenz's mugshots have little context to them to begin with, but here we are: we finally can't tell what's real anymore.
For what it's worth, though, some of the pictures do have revealing background details, like a note pasted behind the subject that shows up in some shots, which a GAN trained solely on faces shouldn't be capable of rendering. That tells me some of these pictures are the real deal. But again, for a lot of these there's no meaningful difference between a real mugshot and a deepfake mugshot. It's a chilling thought.
Yeah, I think people are doing caliper shit partially because they're used to seeing AI output from 10 years ago making the rounds in popular media. I'm not saying some of these can't be real, or can't closely approximate a source image from the training data, which would have to look something like this. BUT I don't think AI-generated people are going to wear a big sign that says "I'm fake" anymore, like they used to.
In the next few years it's going to become increasingly concerning how much the line between real and unreal will blur in cyberspace.