• SerLava [he/him]
    2 years ago

    In anything past the short term, deepfakes are absolutely NOT going to just trick people into believing whatever they see in a video or photo. Here's what will actually happen:

    • People will watch realistic videos of Abraham Lincoln discussing anime with Barack Obama and then transforming into dogs Animorphs style
    • People just won't believe photographic or video evidence
    • There will be some kind of video capture system that somehow authenticates individual frames as they're shot, then registers the video with a centralized server run by a tech giant (rough sketch of that scheme after this list)
    • People won't bother to look that up and just won't believe the authenticated video either
    • Devices will start to react to non-authenticated video and flag it proactively
    • This will be sufficient for normal people to trust videos again
    • The right wing will say the system is a lie set up by the deep state and ignore it
    • They'll mostly use this attitude to claim real videos are fake, not the other way around.
    • Once video forensics can't reliably detect deepfakes and we're reliant on whatever cryptographic authentication scheme, the CIA will actually slap a gag order on Apple or whoever, sneak in some real deepfake psyops, and get found out 25 years later
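
    A minimal sketch of what that capture-time scheme could look like, assuming a per-device signing key and chained frame hashes; the key handling, record fields, and registry step are all made-up illustrations, not any real platform's API:

    ```python
    # Sketch: hash-chain every frame at capture time, sign the final digest
    # with a device key, and register that record centrally. All names here
    # (key, record fields) are illustrative assumptions.
    import hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    device_key = Ed25519PrivateKey.generate()  # in reality: sealed in secure hardware

    def chain_frames(frames: list[bytes]) -> bytes:
        """Fold every frame into one running SHA-256 digest, in capture order."""
        digest = b"\x00" * 32  # fixed genesis value for the chain
        for frame in frames:
            digest = hashlib.sha256(digest + frame).digest()
        return digest

    def sign_capture(frames: list[bytes]) -> dict:
        """Build the record a capture device might send to the central registry."""
        chain_digest = chain_frames(frames)
        return {
            "chain_digest": chain_digest.hex(),
            "signature": device_key.sign(chain_digest).hex(),
            "frame_count": len(frames),
        }

    # A verifier recomputes the chain from the published video and checks the
    # signature against the registered record using the device's public key.
    frames = [b"frame-0-bytes", b"frame-1-bytes"]
    record = sign_capture(frames)
    device_key.public_key().verify(
        bytes.fromhex(record["signature"]), chain_frames(frames)
    )  # raises InvalidSignature if any frame was altered
    print("verified capture of", record["frame_count"], "frames")
    ```

    The chaining means nobody can splice, drop, or reorder frames without breaking the final digest, and the registry only has to store one signed record per video.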
    • JuneFall [none/use name]
      2 years ago

      My parents already believe anything if it's in the newspaper, so I don't share your opening premise.

      The rest sounds possible, though only some of the bullet points are plausible.

      They’ll mostly use this attitude to claim real videos are fake, not the other way around.

      That's one example, but they'll also gladly accept fakes, as they do now.

      • SerLava [he/him]
        2 years ago
        Oh also, the right wingers will come out with like a very simplistic watermark in the corner of videos, and they'll collectively believe it's put there by a competing authentication platform, only patriotic.