It seems like we are watching the (slow) de-legitimization of American liberal democracy unfold. At least from my perspective. For example, the other day, my normally progressive-liberal parents were talking about how they would prefer to live in China at this point. It caught me off guard, ngl. People are losing faith, even if Biden's win has given many a brief dose of copium. What do y'all think?
You aren't wrong about 2016. The energy Trump has put out there is not going away. We started seeing cracks in the trust in our liberal democracy around that time.
Biden isn't going to be able to fix this pandemic. He doesn't support lockdowns and he has already flip-flopped on mask mandates. The big thing he is going to screw up will be the COVID relief bill, which will likely just be tax credits, paid for by finally defunding Social Security or Medicaid, something he has worked toward his whole career. Biden will be the austerity president, and the backlash is going to produce serious anti-government sentiment unlike anything we have ever seen before. Biden very well might destroy the Democratic party altogether. There's no way they win the midterms, and 2024 is shaping up to be the year a competent fascist arises from the GOP.