It seems like we're watching the (slow) de-legitimization of American liberal democracy unfold, at least from my perspective. For example, the other day my normally progressive-liberal parents were saying they'd prefer to live in China at this point. It caught me off guard, ngl. People are losing faith, even if Biden's win has given many a brief dose of copium. What do y'all think?