The majority of Americans who voted, at least in the swing states, voted for the Republicans. Why? Do Republican policies reflect popular opinion? Or is it that their vibes are more aligned with the public's? Or maybe people are worse off now than they were 4 years ago and are hoping to turn back time? As a non-American I don't quite get it. People must think their lives will materially improve under the Republicans, but why?
Democrats (well, both major political parties in the US) are capitalist and, as such, are standing on the conveyor belt of capital interest, which constantly moves to the right. They would have to seriously fight against the forces of capital just to appear to stay roughly in one place, which they do not. The last time there was any meaningful resistance to capital interest was the Keynesian reforms enacted to get out of the Great Depression, but those were really only there to keep capitalism on life support long enough for them to be rolled back.