The majority of Americans who voted, at least in the swing states, voted for the Republicans. Why? Do Republican policies reflect popular opinion? Or is it that their vibes are more aligned with the public's? Or maybe people are worse off now than they were 4 years ago and are hoping to turn back time? As a non-American I don't quite get it. People must think their lives will materially improve under the Republicans, but why?
Iirc Biden championed the law that made student debt non-dischargeable in bankruptcy. Like it was his baby back in ~2005 or something. People really thought he was going to unravel his own legacy, and then believed him when he said his hands were tied.