PredPol (now renamed Geolitica) built racial- and class-profiling AI software, then sold it to police departments along with policing and behavioral “recommendations” that included harassment of entire communities.


The Markup’s reporting (includes data visualizations): https://themarkup.org/prediction-bias/2021/12/02/crime-prediction-software-promised-to-be-free-of-biases-new-data-shows-it-perpetuates-them

Gizmodo story here: https://gizmodo.com/crime-prediction-software-promised-to-be-free-of-biases-1848138977

The Markup’s article outlining the methodology they used to analyze the data: https://themarkup.org/show-your-work/2021/12/02/how-we-determined-crime-prediction-software-disproportionately-targeted-low-income-black-and-latino-neighborhoods


This story broke at the end of last year and I somehow missed it.

The founders of Geolitica/PredPol have known that their software racially profiles since an independent study was published in 2018. They declined to update their algorithm and withheld the information from the police departments they sold it to.


Link to independent study showing clear racial and class bias of Geolitica’s PredPol AI: https://ieeexplore.ieee.org/abstract/document/8616417


The authors of the study even provided Geolitica with potential fixes to the algorithm. The company declined to implement them because the fixes would have reduced the number of racial-profiling recommendations (i.e. “cRiMe pReDicTiOns”).

Even worse, a number of police departments were still using this software when these stories broke.

:acab-3: :acab:

There is a ton of information in all of these reports, and it’s worth reading through them separately from my summary; more highlights in the comments. Incredible journalism from Gizmodo and The Markup.

  • Jadzia_Dax [she/her] · 2 years ago

    Between 2018 and 2021, more than one in 33 U.S. residents were potentially subject to police patrol decisions directed by crime prediction software called PredPol.

    “I would liken it to policing bias-by-proxy,” Elgin Police Department deputy chief Adam Schuessler said in an interview. The department has stopped using the software.

    Surprising that a pig actually called that out and even stopped using the software.

    • Jadzia_Dax [she/her] · edited · 2 years ago

      Police aren’t required by law to disclose when an arrest is due to Geolitica/PredPol’s racial-profiling AI software:

      Jumana Musa, director of the National Association of Criminal Defense Lawyers’ Fourth Amendment Center, called the lack of information a “fundamental hurdle” to providing a fair defense.

      “It’s like trying to diagnose a patient without anyone fully telling you the symptoms,” Musa said. “The prosecution doesn’t say, ‘The tool that we purchased from this company said we should patrol here.’ ”

      That’s because they don’t know either, according to the National District Attorneys Association, which polled a smattering of members and found that none had heard of it being part of a case.