• bncvdg [none/use name]
    ·
    4 years ago

    The ban wouldn't get rid of it. It would just end up effectively restricting development of the technology to intelligence agencies and corporations instead of it floating around in the public. The activists aren't forcing lawmakers to care (lol); the ruling class recognizes that there is an advantage to establishing a monopoly on fakes in a medium everyone still views as legitimate.

    • TossedAccount [he/him]
      ·
      4 years ago

      the ruling class recognizes that there is an advantage to establishing a monopoly on fakes in a medium everyone still views as legitimate.

      This is what makes me suspect that some of Trump's and Biden's more sterile, polished speeches might have been deepfaked, if speechwriters and teleprompters didn't have at least as much explanatory power.

      • SerLava [he/him]
        ·
        4 years ago

        Nah you can still tell at this point

        Once you can't tell, I suspect people will use special internet-connected cameras that validate frames with a secure server or something, so that you can check for authenticity. CHUDs will simply say the authentication service is a liberal hoax, but we will all be able to tell.
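
        The "camera that validates frames with a secure server" idea is roughly how content-provenance schemes work: the capture device attaches a cryptographic tag to each frame, and anyone can later check that the pixels haven't been altered. Here's a minimal sketch using a shared-secret HMAC; the key, function names, and frame data are all made up for illustration, and a real system (e.g. the C2PA standard) would use per-device public-key signatures rather than a shared secret:

        ```python
        import hashlib
        import hmac

        # Hypothetical secret shared between the camera and the
        # verification service (real schemes use public-key signatures
        # so the verifier never holds the signing key).
        CAMERA_KEY = b"example-camera-secret"

        def sign_frame(frame_bytes: bytes) -> str:
            """Camera side: produce an authentication tag for one frame."""
            return hmac.new(CAMERA_KEY, frame_bytes, hashlib.sha256).hexdigest()

        def verify_frame(frame_bytes: bytes, tag: str) -> bool:
            """Verifier side: any pixel-level tampering breaks the tag."""
            expected = hmac.new(CAMERA_KEY, frame_bytes, hashlib.sha256).hexdigest()
            return hmac.compare_digest(expected, tag)

        frame = b"\x00\x01\x02"          # stand-in for raw frame data
        tag = sign_frame(frame)
        assert verify_frame(frame, tag)              # untouched frame passes
        assert not verify_frame(frame + b"x", tag)   # deepfaked frame fails
        ```

        The catch, as the thread notes, is social rather than cryptographic: the scheme only proves a frame came from a trusted camera, and it's worthless to anyone who decides the trust anchor itself is a hoax.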

        • warped_fungus [she/her]
          ·
          4 years ago

          How do we know we haven't passed the line of being unable to tell? What if poorly made ones are made on purpose to shake off suspicion that the tech is improving faster than we know 🤔

          • SerLava [he/him]
            ·
            4 years ago

            Well, then we will have fake videos until the public version of deepfake gets to that level, and is used to demonstrate impossible stuff undetectably. Then the magic cameras will become widespread.

      • LoMeinTenants [any]
        ·
        4 years ago

        No. To make a statement like that right now feels like Trump apologia.

        There's definitely potential for it, but why would they consider it when these politicians are functioning exactly as intended?

        • TossedAccount [he/him]
          ·
          4 years ago

          The argument for Trump having been deepfaked is precisely that they would have needed it to pacify his base because he went rogue by continuing to encourage the January 6 gathering right up until shit got real and Twitter banned his ass.

      • jabrd [he/him]
        ·
        4 years ago

        Even if the technology were there for it (which it very well could be, the govt is usually 10 years ahead of the public technology-wise), there just doesn't seem like a big enough payoff to risk getting exposed just to have one of these senile fucks not trip over their own words during a briefing. Maybe the CIA has some deepfakes put aside for rainy days of national leaders stepping down to ease up their coups, but I doubt anything's being rolled out for that sort of thing

  • MarxistHedonism [she/her]
    ·
    4 years ago

    Photoshopped picture of Nancy Pelosi in lingerie: 25 likes, 31 comments

    This post: 14 likes, 0 comments

    Obviously Pelosi is far more privileged and shielded from consequences, unlike the women in this article, but it’s still gross.

  • comi [he/him]
    ·
    edit-2
    4 years ago

    I have a question though, conceivably if deepfake is done frequently and the knowledge is widespread, would it not shield everyone like “oh that’s not me”? On the other hand it can be done secretly with non-public photos, so that would suck:(

    Edit: but now that I’m thinking about it, it could be bonkers opportunity for gaslighting and changing peoples memories, even outside of porn. So 🤢

      • comi [he/him]
        ·
        edit-2
        4 years ago
        spoiler

        Yeah, you’re right, it’s violating. I just don’t know (can’t conceive, don’t have experience) if the violation is betrayal of intimacy/trust or precisely the images themselves. Like, to me, if someone photoshopped my head onto some compromising photo, I would not be super bothered, but if someone divulged my secret or a real photo of me in the same position, it would be much more damaging for trust reasons.

        Edit: dummy forgot about SA angle

          • comi [he/him]
            ·
            edit-2
            4 years ago
            subject to self-crit

            I’m putting a little bit of debate-bro here, so apologies. But people in memes, for example, also have not chosen to be seen by billions (like the Nirvana kid or others), and aversion to sexual positions comes from society's puritanical bent, so ideally all of this (non-public figures' images, photoshops of them, etc.) should be illegal or questionable (?). I agree that in the world we live in right now it’s violating, I’m just musing here, sorry

  • UglySpaghettiHoe [he/him]
    ·
    4 years ago

    "this violates my freedum of speech somehow." But seriously, this shit is beyond creepy

  • KrasMazovThought [comrade/them]
    ·
    4 years ago

    It will be restored to its true purpose of blackmailing oligarchs and inserting Chris Pratt into the Sound of Music

    • TossedAccount [he/him]
      ·
      4 years ago

      Unfortunately I don't even think that would work on the people we'd want it to work on.

      Oligarchs are so powerful that this shit wouldn't work on them; Trump for example was so scandal-proof that no deepfake would have hurt his credibility among people who didn't already hate him. Meanwhile the oligarchs don't even need to use deepfakes to discredit left-wing figures on the flimsiest basis (e.g. the Corbyn antisemitism smear, Operation Carwash).

      • SerLava [he/him]
        ·
        4 years ago

        Trump for example was so scandal-proof that no deepfake would have hurt his credibility among people who didn’t already hate him

        That's not true. You're not thinking creatively enough: they could deepfake him being woke or anti-white or gay or weak

        • KrasMazovThought [comrade/them]
          ·
          4 years ago

          Rachel Maddow furiously deepfaking Trump and Putin into a gay sex scene to the (non-sexual) salivation of liberals everywhere