• KittyBobo [he/him, comrade/them]
    ·
    11 months ago

    I mean, without AI they'd just be using bad Photoshop. Heck, if you got someone who was good at Photoshop and could make realistic propaganda, that'd be worse than AI images that can easily be picked apart.

    • regul [any]
      ·
      11 months ago

      Honestly this picture looks more like bad photoshop than AI.

    • buckykat [none/use name]
      ·
      11 months ago

      It looks like the word PRESS is still badly photoshopped in here, probably because the AI image generators still suck at generating text

      • DamarcusArt@lemmygrad.ml
        ·
        11 months ago

        They didn't even bother to rotate and angle it properly, it's like they just copy-pasted the text and called it a day 😂

    • viva_la_juche [they/them, any]
      ·
      11 months ago

      So what you’re saying is we need to shut down computers until we can figure out what’s going on

    • gramathy@lemmy.ml
      ·
      11 months ago

      I wonder if the forensic techniques used to identify photoshopped images and altered audio also work on AI-generated media?

      I know you can timestamp audio to a specific point in time by matching the frequency of the background electrical hum, if it's available; so if it should be available but isn't, that could indicate a video or audio clip is fake. I also know that differences in image grain/quality can identify patchwork images. But do those tells also show up in AI-generated media?
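The hum-matching technique described above is known as electrical network frequency (ENF) analysis: the mains hum drifts slightly around its nominal 50 or 60 Hz, and that drift pattern can be matched against grid records to date a recording. A minimal sketch of the first step, tracking the per-window hum frequency with a plain numpy FFT peak pick (the function name, parameters, and synthetic signal are illustrative, not a real forensic tool):

```python
import numpy as np

def enf_track(signal, sr, nominal=50.0, band=1.0, win_s=1.0):
    """Estimate the mains-hum (ENF) frequency in each window of `signal`.

    Returns one dominant-frequency estimate (in Hz, near `nominal`)
    per non-overlapping window of `win_s` seconds.
    """
    win = int(sr * win_s)
    freqs = np.fft.rfftfreq(win, 1.0 / sr)
    # Only look at FFT bins within +/- band Hz of the nominal grid frequency.
    mask = (freqs >= nominal - band) & (freqs <= nominal + band)
    estimates = []
    for start in range(0, len(signal) - win + 1, win):
        chunk = signal[start:start + win] * np.hanning(win)
        spec = np.abs(np.fft.rfft(chunk))
        # Pick the strongest bin inside the hum band.
        estimates.append(freqs[mask][np.argmax(spec[mask])])
    return np.array(estimates)

# Demo: a synthetic 10 s "recording" with a faint 50 Hz hum plus noise.
sr = 1000
t = np.arange(0, 10, 1.0 / sr)
rng = np.random.default_rng(0)
audio = 0.2 * np.sin(2 * np.pi * 50.0 * t) + 0.2 * rng.normal(size=t.size)

track = enf_track(audio, sr)  # ten per-second hum estimates near 50 Hz
```

A real forensic pipeline would then compare this frequency track against logged grid-frequency data; a genuine recording lines up with one point in time, while a spliced or synthetic clip shows a discontinuous or missing hum track.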