Nerds are like “the algorithm can’t be racist it’s just code”
The racism isn’t in the fact that training AI to recognize people of varying skin color is difficult, the racism is in the fact that they know they’re deploying software which treats races differently and don’t see anything wrong with that. If they weren’t racist this shit would still be in the lab until it worked effectively for everyone. Dense motherfuckers.
Also code can easily be racist. Laws can easily be racist. Books can easily be racist. Anything man made can be racist
Yeah but you’re not going to win that argument, these people have brainworms so bad that they’ll say shit like “Mein Kampf isn’t racist it’s just words on paper”
True, but in this case I don't think it's the code, it's the data. The people involved in developing this probably didn't even think about it and just used whatever dataset was available, and most facial recognition datasets are heavily skewed white. Or if they knew and tried to bring it up to the decision-makers, it got ignored.
Racism is not measured by intent, but by effect. If the output of the code is racist, then it’s racist. It doesn’t matter if the coders weren’t racist.
Yeah, but what I mean is that the deciding thing here is the data. The same piece of code can produce very different models depending on what you feed it. Very often this is what happens because the engineers involved forget about the data part and think that because the code itself can't be racist, the outcome also isn't.
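For what it's worth, you can show this in a few lines. Below is a minimal sketch on made-up synthetic data (the group setup, feature rules, and proportions are all hypothetical, not from any real face-recognition system): the exact same training code, run once on a skewed dataset and once on a balanced one, gives very different accuracy for the under-represented group.

```python
# Minimal sketch, synthetic data only: same training code, different data mix,
# very different per-group accuracy. Not a real face-recognition pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, relevant_feature):
    """Synthetic stand-in for face data: the label depends on a different
    feature column for each group (column 0 for group A, column 1 for group B)."""
    X = rng.normal(size=(n, 2))
    y = (X[:, relevant_feature] > 0).astype(int)
    return X, y

def train_and_report(n_a, n_b):
    """Same pipeline every time; only the group proportions in training change."""
    Xa, ya = make_group(n_a, relevant_feature=0)   # group A training samples
    Xb, yb = make_group(n_b, relevant_feature=1)   # group B training samples
    model = LogisticRegression().fit(np.vstack([Xa, Xb]),
                                     np.concatenate([ya, yb]))
    # Evaluate separately per group on fresh samples.
    Xa_t, ya_t = make_group(5000, 0)
    Xb_t, yb_t = make_group(5000, 1)
    print(f"group A accuracy: {model.score(Xa_t, ya_t):.2f}, "
          f"group B accuracy: {model.score(Xb_t, yb_t):.2f}")

train_and_report(9500, 500)    # skewed 95/5: high accuracy for A, near coin-flip for B
train_and_report(5000, 5000)   # balanced: roughly the same accuracy for both groups
```

The code never changes between the two runs; only the data mix does, and that alone decides which group the model fails on.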