It seems like we are watching the (slow) de-legitimization of American liberal democracy unfold. At least from my perspective. For example, the other day, my normally progressive-liberal parents were talking about how they would prefer to live in China at this point. It caught me off guard, ngl. People are losing faith, even if Biden's win has given many a brief dose of copium. What do y'all think?
Or us
We aren't exactly movers and shakers imo, but we'll see if something catches on in the next couple of years.