When it comes to Africa, for example, I was taught that they're poor because their land isn't harvestable or some bullshit like that, and then I find out as an adult it's actually because Western countries fucked the shit out of them, and huh, that makes a lot more sense.
"Yes I know my enemies They're the teachers who taught me to fight me"