PredPol (now renamed Geolitica) built racial- and class-profiling AI software, then sold it to police departments along with policing and behavioral “recommendations” that amounted to harassment of entire communities.


The Markup’s reporting (includes data visualizations): https://themarkup.org/prediction-bias/2021/12/02/crime-prediction-software-promised-to-be-free-of-biases-new-data-shows-it-perpetuates-them

Gizmodo story here: https://gizmodo.com/crime-prediction-software-promised-to-be-free-of-biases-1848138977

The Markup’s article outlining their methodology for analyzing this data: https://themarkup.org/show-your-work/2021/12/02/how-we-determined-crime-prediction-software-disproportionately-targeted-low-income-black-and-latino-neighborhoods


This story broke at the end of last year and I somehow missed it.

The founders of Geolitica/PredPol have known their software racially profiles since an independent study was published in 2018. They declined to update their algorithm and withheld that information from the police departments they sold the software to.


Link to independent study showing clear racial and class bias of Geolitica’s PredPol AI: https://ieeexplore.ieee.org/abstract/document/8616417


The authors of the study even provided Geolitica with potential fixes to the algorithm. The company declined to implement them because the fixes would have decreased the number of racial-profiling recommendations (i.e., “cRiMe pReDicTiOns”).

Even worse, a number of police departments were still using this software at the time the story broke.

:acab-3: :acab:

There is a ton of information in all of these reports, and it’s worth reading through them separately from my summary; more highlights in the comments. Incredible journalism from Gizmodo and The Markup.

  • RNAi [he/him]
    ·
    2 years ago

    Are you fucking kidding me, this is a textbook, cartoonishly bad example of machine learning done wrong, done stupidly, and applied evilly.

    • Jadzia_Dax [she/her]
      hexagon
      M
      ·
      edit-2
      2 years ago

      It’s disgusting and sloppy. The CEO of this company and every pig who purchased or used their software should be in jail or facing the wall.

  • Jadzia_Dax [she/her]
    hexagon
    M
    ·
    edit-2
    2 years ago

    Holy shit, the LAPD had an insecure link on its public website to an open cloud storage drive filled with prediction data for the LAPD and dozens of other departments.

    This means that PredPol (now renamed Geolitica) was not segmenting user data between regions and departments.

    Reports "Found on the Internet"

    We found the crime predictions for our analysis through a link on the Los Angeles Police Department’s public website, which led to an open cloud storage bucket containing PredPol predictions for not just the LAPD but also dozens of other departments. When we downloaded the data on Jan. 31, 2021, it held 7.4 million predictions dating back to Feb. 15, 2018. Public access to that page is now blocked.

    PredPol has since renamed itself Geolitica. When asked for comment, the CEO tried to get the reporters to use a different, pre-curated data set instead. When the journalists declined, he ghosted them lol

    PredPol, which renamed itself Geolitica in March, criticized our analysis as based on reports “found on the internet.” But the company’s CEO did not dispute the authenticity of the prediction reports, which we provided, acknowledging that they “appeared to be generated by PredPol.”

    We explained that we had already discovered date discrepancies for exactly 20 departments and were not using that data in our final analysis, and volunteered to share the analysis dates with him for confirmation. He instead offered to allow us to use the software for free on publicly available crime data instead of reporting on the data we had gathered. After we declined, he did not respond to further emails.

  • Jadzia_Dax [she/her]
    hexagon
    M
    ·
    edit-2
    2 years ago

    Imagine building something like this, except you trained the model to detect when a private business was likely to commit wage theft, tax evasion, or labor violations.

    Could have fun inputs like pulling the schools and former companies of every C-level executive of the company you’re analyzing off of LinkedIn.
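
    Purely for fun, here’s a toy sketch of what that scoring could look like. Everything in it is hypothetical (the data source, the names, the helper function), just riffing on the idea:

    ```python
    # Hypothetical sketch of the idea above. None of these names or data
    # sources are real; the "feature" is simply: what fraction of the
    # executives' prior employers are known labor-law violators?
    from dataclasses import dataclass

    @dataclass
    class Executive:
        school: str                  # e.g., scraped from a public profile
        former_companies: list[str]  # prior employers

    def wage_theft_risk(execs: list[Executive], known_violators: set[str]) -> float:
        """Fraction of the executives' prior employers with labor violations."""
        priors = [co for ex in execs for co in ex.former_companies]
        if not priors:
            return 0.0
        return sum(co in known_violators for co in priors) / len(priors)

    # Toy usage: two execs, one prior employer with violations on record.
    execs = [
        Executive("State U", ["Acme Logistics"]),
        Executive("Elite B-School", ["MegaRetailCo", "GigWorkNow"]),
    ]
    print(wage_theft_risk(execs, {"GigWorkNow"}))  # ~0.33
    ```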

  • buh [she/her]
    ·
    2 years ago

    “PredPol” sounds like something from Zootopia

  • Shoegazer [he/him]
    ·
    2 years ago

    Libs will still call you overreacting when you say all cops deserve the bullet

  • AcidSmiley [she/her]
    ·
    2 years ago

    One of the first, and reportedly most widely used, is PredPol, its name an amalgamation of the words “predictive policing.”

    more like an amalgamation of predatory :le-pol-face:

  • Jadzia_Dax [she/her]
    hexagon
    M
    ·
    2 years ago

    Between 2018 and 2021, more than one in 33 U.S. residents were potentially subject to police patrol decisions directed by crime prediction software called PredPol.

    “I would liken it to policing bias-by-proxy,” Elgin Police Department deputy chief Adam Schuessler said in an interview. The department has stopped using the software.

    Surprising that a pig actually called that out and even stopped using the software.

    • Jadzia_Dax [she/her]
      hexagon
      M
      ·
      edit-2
      2 years ago

      Police aren’t required by law to disclose when an arrest is due to Geolitica/PredPol’s racial-profiling AI software:

      Jumana Musa, director of that group’s Fourth Amendment Center, called the lack of information a “fundamental hurdle” to providing a fair defense.

      “It’s like trying to diagnose a patient without anyone fully telling you the symptoms,” Musa said. “The prosecution doesn’t say, ‘The tool that we purchased from this company said we should patrol here.’ ”

      That’s because they don’t know either, according to the National District Attorneys Association, which polled a smattering of members and found that none had heard of it being part of a case.

  • Jadzia_Dax [she/her]
    hexagon
    M
    ·
    edit-2
    2 years ago

    Example and visualization of the racial profiling in Geolitica’s PredPol AI:

    These two neighborhoods in NJ are less than a mile apart:

    https://hexbear.net/pictrs/image/xZXjtFHaxb.jpg

    In the 63% white neighborhood, it predicted 11 crimes. In the 0% white neighborhood, it predicted 1,940 crimes: over 175 times as many.

    Dozens of police departments across the U.S. used the same software, with the same results.

  • thisismyrealname [he/him]
    ·
    2 years ago

    how the fuck can you even claim to "predict crime" before it happens

    precogs aren't real, so your software would have to just rely on whatever crime statistics are already collected, WHICH EVERYONE KNOWS ARE INACCURATE BECAUSE OF OVERPOLICING ALREADY

    • Socialcreditscorr [they/them,she/her]
      ·
      edit-2
      2 years ago

      Which causes even more overpolicing, which causes stronger “prediction” recommendations, which causes even more overpolicing, “and so on, and so on, and so on.” :zizek-fuck:
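
      That runaway loop is easy to demonstrate. Here’s a minimal sketch in Python (my own toy model of the mechanism, not PredPol’s actual algorithm): two neighborhoods with identical true crime rates, one starting with an inflated recorded count from past overpolicing.

      ```python
      # Toy model of the feedback loop. NOT PredPol's real algorithm.
      # Both neighborhoods have the same true crime rate; B merely starts
      # with a higher *recorded* count because it was overpoliced.
      true_rate = 10
      recorded = {"A": 100, "B": 120}

      for year in range(10):
          # "Prediction": patrol wherever recorded crime is highest.
          target = max(recorded, key=recorded.get)
          # Patrols only observe crime where they are sent, so only the
          # targeted neighborhood's recorded count grows.
          recorded[target] += true_rate
          print(year, recorded)

      # Ends at {'A': 100, 'B': 220}: the gap only ever widens, even though
      # the underlying rates never differed. Every patrol "confirms" the
      # prediction that sent it.
      ```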

  • Jadzia_Dax [she/her]
    hexagon
    M
    ·
    edit-2
    2 years ago

    Here is the supplementary article from The Markup showing their work on how they analyzed this data: https://themarkup.org/show-your-work/2021/12/02/how-we-determined-crime-prediction-software-disproportionately-targeted-low-income-black-and-latino-neighborhoods

  • Ideology [she/her]
    ·
    2 years ago

    I read a short sci fi story about this like 10 years ago...