This is half-a-decade-old news, but I only found it out myself after it came up in conversation at the DMV; the worker would not have told me otherwise. Every DMV photo in the United States is being used for AI facial recognition, and nobody has talked about it for years. This is especially concerning given that citizens are now being required to update their IDs to a "Real ID," which means more people than ever are giving away the rights to their own face.

The biggest problem with privacy issues is that people talk about them for a while, but more often than not nothing is ever done to fix the problem; it simply gets forgotten. For example, in the next few years Copilot will simply become a part of people's lives, and people will slowly stop talking about the privacy implications. What can we even do to fight the privacy practices of giants?

  • helenslunch@feddit.nl
    ·
    4 months ago

    What can we even do to fight the privacy practices of giants?

    Not much unless you're a billionaire or a politician.

  • davel [he/him]@lemmy.ml
    ·
    4 months ago

    This is especially concerning given that citizens […]

    Not everyone with a US driver’s license is a US citizen.

    • Charger8232@lemmy.ml
      hexagon
      ·
      4 months ago

Correct, however this issue primarily affects US citizens, given that driver's licenses aren't the only IDs the DMV takes pictures for (e.g. the aforementioned Real ID).

  • krolden@lemmy.ml
    ·
    4 months ago

I see no issue with the government using photo ID pictures this way, as long as they aren't using third parties to handle the technical aspect of it or allowing any of the data to be handled by third parties (e.g. private corps). They would be stupid to ignore that large amount of known-good data they could train their facial rec models on. Yes, it sounds big and evil, but that's the world we live in as long as this technology exists and you want to participate in society, I guess.

    They're collecting the data already, it's being used this way already by everyone else, so why not?

    • Charger8232@lemmy.ml
      hexagon
      ·
      4 months ago

      Many people's threat models, like my own, are against mass surveillance. This falls under that category, even if it's being handled responsibly.

      • krolden@lemmy.ml
        ·
        edit-2
        4 months ago

        You can be against it all you want but that doesn't mean it's going to matter IRL. The state of the world is that anyone with a large amount of data like this is using it to build models so they can profit and/or enforce. Even if they say they're not doing it, they're still doing it. Or someone with access to that data is doing it.

Crying about the feds/DMV doing facial rec training is low-hanging fruit. Obviously they're going to be doing it, along with every other government on the planet with the resources to do it. TBH there's nothing inherently malicious about it, since their having the data they're using is part of you having citizenship/identification in that country. The real malicious ones are the corporations contracted by said government to do the exact same thing, except they're doing their own data collection through huge networks of privately owned security cameras.

The only way to avoid this is to go live in the woods and never come out. Any show of transparency or option to opt out of any of this would just be theater for you. It's being done, has been done, and will be done without your consent or knowledge.

        • Charger8232@lemmy.ml
          hexagon
          ·
          4 months ago

Just because mass surveillance is already happening doesn't mean we should accept it as our only option. While it's true that governments and corporations are collecting data on us, there is still merit in pushing back against these practices. The point of privacy is not to hide everything and live in the woods; the point of privacy is to have control over what data you share, when you share it, and with whom you share it. The problem isn't facial recognition itself; the problem is that living in the woods shouldn't be the only way to avoid it. We should be able to opt out. What may seem fine to you is not always fine with others. That's why threat models exist, after all.