The majority of Americans who voted, at least in the swing states, voted for the Republicans. Why? Do Republican policies reflect popular opinion? Or is it that their vibes are more aligned with the public's? Or maybe people are worse off now than they were 4 years ago and are hoping to turn back time? As a non-American I don't quite get it. People must think their lives will materially improve under the Republicans, but why?

  • Frank [he/him]
    ·
    2 months ago

    Right? And they were all so fucking smug after those "Your husband won't know that you voted Democrat" letters.