When it comes to Africa, for example, I was taught that they're poor because their land isn't harvestable or some bullshit like that. Then as an adult I find out it's actually because Western countries fucked the shit out of them, and huh, that makes a lot more sense.
You had some good teachers, by the sounds of it. Even if it doesn't tell the whole story, it seems like at least an honest attempt to understand it.