Didn’t it come out early on that AI was racist?

That’s been happening for a really long time already. It’s called racism; now they’re teaching it to computers.

It didn’t really “come out”. It was always known that garbage in leads to garbage out, and that models reflect their training data. No serious researcher was surprised to learn that models reflect the biases of their training data, because that’s part of the design.