True, but this is kind of the point. Models are trained on existing data, and so they only serve to amplify and conserve the biases hidden within that data.
Fwiw there are also AI projects run by marginalized people dedicated to identifying and flagging those biases, which will help with future training processes. It’s not an inherently conservative technology. It’s just that, as with everything, it lacks those cultural filters because of which groups tend to get access to it first
And in the case of the project I mentioned above, the tech is used to find those patterns in order to avoid them. Entrenched oppressive structures aren’t only about conserving patterns; they’re about preserving specific patterns and dampening others. For example, if this tech had been available during the AIDS epidemic, there would have been plenty of data revealing just how prevalent gay culture was in certain areas, whereas the dominant ideology insisted that gay people were rare and unnatural. That data and the resulting analysis in the hands of police officers would have had very different outcomes from the same data/analysis in the hands of gay activists
Oh yes, definitely, that's why I added the second paragraph. I just wanted to mention the most likely reason why this happened.
The fact that many STEMheads and techbros don't bother with social justice makes it all the harder to check for biases, though, because they rarely concern themselves with the ethical ramifications of the tech they develop, or just assume there's no way it could be used to reinforce oppressive systems.