Idk a whole lot about ai but the thing, to me, that seems somewhat concerning is that the developers seem to usually have problems keeping their biases from infiltrating the ai
Like I’ve seen all these weird examples of how bias in the training data led to weird unexpected results, and then I think, it’s probably mostly brainwormed ass labor aristocracy tech bros making these things, and if one ever does go off the rails it may do some super ai version of that thing middle class whites do where they think the black patron at a store works there
At least so far, we're really good at training an AI to do something the way we already do it, but training an AI to do something new or better is much more difficult (outside of a handful of applications like playing classic board games, we haven't got it figured out). That's why the notion of police and court systems using AI is so horrific, because it doesn't just "help overworked judges" or whatever, it permanently codes all of our currently-existing biases into the system while hiding them behind a layer of abstraction.
Yeah this is the exact shit that worries me. Alongside that abstraction is this weird idea that’s prevalent that like “well if the computer says it, it must be right, right?”
Just seems like the next logical step in putting accountability even further out of reach for the ruling class