Among other things, such as the US expansion into the Pacific and the oil embargo. But holy shit, I love learning about history not taught in schools, very eye-opening.
Yes, that is actually very much what they did, only they didn't "start" then; they had already been doing it for decades because they were a literal empire. I swear, if people start defending Imperial Japan from a woke perspective I'm gonna lose it.
I'm not defending them at all, merely commenting that my US education always painted us as the good guy and our actions as existing in a vacuum.
There was no additional context regarding this particular subject that somehow makes it different. Yes, true, it didn't come out of nowhere, because they were already an imperialist country. Perhaps if a large number of things had gone differently they wouldn't have sided with the Nazis, and they'd still be around oppressing the Chinese, Koreans, etc. in the areas they already held. Maybe they'd have grabbed a few more places too.
And I agree, Japan has arguably been an imperialist country for centuries, dating back to like their Warring States period and whatnot.