It seems like we are watching the (slow) de-legitimization of American liberal democracy unfold, at least from my perspective. For example, the other day, my normally progressive-liberal parents were talking about how they would prefer to live in China at this point. It caught me off guard, ngl. People are losing faith, even if Biden's win has given many a brief dose of copium. What do y'all think?
Many more people have died than at Chernobyl, yes, but I'm really starting to doubt it will cause people to shift their opinion of the government, mostly because Americans are stubborn fucks.