I fucking hate living here. The culture is poison. The economy is a fucking disaster. The education is designed to leave a huge portion of the country illiterate. Every single atom of it is white supremacist. The land is all stolen. The capitalists are completely above the law. Fuck cars. Fuck America.
We need ways of breaking through the dynamic where, if you say everything American is bad, you get cast as the bogeyman.
Failed, bankrupt, corrupted, deceptive, shallow... There are lots of things we can say without outright saying "America bad".