The majority of Americans who voted, at least in the swing states, voted for the Republicans. Why? Do Republican policies reflect popular opinion? Or is it that their vibes are more aligned with the public's? Or maybe people are worse off now than they were four years ago and are hoping to turn back time? As a non-American I don't quite get it. People must think their lives will materially improve under the Republicans, but why?
Turns out the 'moderate Republican White women upset over abortion' that Harris kept chasing were, at the end of the day.......still White Women gasp
Right? And they were all so fucking smug after those "Your husband won't know that you voted Democrat" letters.