The majority of Americans who voted, at least in the swing states, voted for the Republicans. Why? Do Republican policies reflect popular opinion? Or is it that their vibes are more aligned with the public's? Or maybe people are worse off now than they were 4 years ago and are hoping to turn back time? As a non-American I don't quite get it. People must think their lives will materially improve under the Republicans, but why?
I disagree that Harris couldn't distance herself from the Biden admin. She chose not to. She repeatedly said that she agreed with Biden on everything and wouldn't do anything differently. Her campaign started off with voters broadly agreeing that Biden's economy wasn't Harris's fault, but she made the decision to embrace inflation, Zionism, and everything else.
But broadly speaking, yeah, it seems we've entered an era where the incumbent eats shit every 4 years. Nobody is going to fundamentally change anything, so the ship will keep careening toward disaster.
Her biggest problem is that she couldn't hide the fact that she's a patsy.