Don’t get me wrong: I’m not defending the current US healthcare system. It’s horrible, riddled with perverse incentives, and should be mostly (if not entirely) nationalized. I’m just not sure how to justify the idea that healthcare is a “right”.
I know people on the left sometimes draw an analogy to the right to a public defender. I’m not sure that argument really holds up, though, because you only have the right to a public defender under the specific circumstance of being prosecuted by the government for a crime. The logic there is: “if the government is going to significantly interfere with your life by arresting you and trying you for a crime, then it at least has to let you mount a defense with a qualified attorney, even if the government has to pay for it.” There isn’t a general right to a publicly paid lawyer for any and all purposes.
But consider: what if every nurse and doctor suddenly quit and nobody wanted to be a healthcare worker anymore? If healthcare were truly a right, the government would have to compel people to provide it, and a “right” that can only be fulfilled by forcing someone else’s labor is hard to square with how we usually think about rights. That’s the thought experiment that makes the framing hard for me to accept.