At least so far, we're really good at training an AI to do something the way we already do it, but training an AI to do something new or better is much more difficult (outside of a handful of applications like playing classic board games, we haven't got it figured out). That's why the notion of police and court systems using AI is so horrific, because it doesn't just "help overworked judges" or whatever, it permanently codes all of our currently-existing biases into the system while hiding them behind a layer of abstraction.
Yeah, this is the exact shit that worries me. Alongside that abstraction is this weird, prevalent idea of "well, if the computer says it, it must be right, right?"
Just seems like the next logical step in putting accountability even further out of reach for the ruling class.