True, but in this case I don't think it's the code, it's the data. The people involved in developing this probably didn't even think about it and just used whatever dataset was available, and most facial recognition datasets are heavily skewed white. Or, if they knew and tried to raise it with the decision-makers, it got ignored.
Racism is not measured by intent, but by effect. If the output of the code is racist, then it’s racist. It doesn’t matter if the coders weren’t racist.
Yeah, but what I mean is that the deciding factor here is the data. The same piece of code can produce very different models depending on what you feed it. Very often that's exactly what happens: the engineers forget about the data side and assume that because the code itself can't be racist, the outcome can't be either.
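That point (identical training code, different data, different per-group outcomes) can be sketched with a toy example. Everything below is hypothetical synthetic data, not any real facial recognition system: a simple threshold classifier is fit on a training set that is 95% group A, then evaluated separately on each group.

```python
import random
random.seed(0)

def sample(group, label, n):
    # hypothetical synthetic scores: group B's distribution is shifted,
    # standing in for demographic differences the model must handle
    shift = 0.0 if group == "A" else 1.5
    center = 2.0 if label == 1 else -2.0
    return [(random.gauss(center + shift, 1.0), label) for _ in range(n)]

def fit_threshold(data):
    # the "code" is identical every run; only the data changes.
    # pick the cutoff that minimizes training error.
    best_t, best_err = 0.0, float("inf")
    for t in (x / 10 for x in range(-50, 51)):
        err = sum((x > t) != bool(y) for x, y in data)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def accuracy(t, data):
    return sum((x > t) == bool(y) for x, y in data) / len(data)

# skewed training set: 95% group A, 5% group B
train = (sample("A", 1, 95) + sample("A", 0, 95)
         + sample("B", 1, 5) + sample("B", 0, 5))
t = fit_threshold(train)

test_a = sample("A", 1, 500) + sample("A", 0, 500)
test_b = sample("B", 1, 500) + sample("B", 0, 500)
print("group A accuracy:", accuracy(t, test_a))
print("group B accuracy:", accuracy(t, test_b))
```

With the skewed training set, the fitted threshold sits where it works well for group A, and group B's error rate ends up noticeably higher, even though not a single line of the training code mentions groups at all.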
Also code can easily be racist. Laws can easily be racist. Books can easily be racist. Anything man made can be racist