when it comes to africa for example, i was taught that they're poor cause their land isn't harvestable or some bullshit like that, and then i found out as an adult it's actually cause western countries fucked the shit out of them, and huh, that makes a lot more sense
That's all you got? I was basically flat out told these were all "backwards" cultures full of "savages" and essentially led to believe that without white people, they'd still be in the stone age. And I live in an area full of liberals.
I often think about this when all these countries celebrate "independence day." Not one country in the world is totally free right now. But if you just tell people they are, well, then they must be.