A guy recently linked this essay. It's old, but I don't think it's significantly wrong (despite GPT evangelists). Also read Weizenbaum, libs, for the other side of the coin.
I have no idea, but neither do you, which is kind of the point. How much of you is your brain, your body, your context?
Even if we can't put a number on it, I think it's trivial to assert that you are not just your brain. So, if you copied only your brain into some kind of computer, there would be parts of you that are missing because you left them behind with your meat.
Sure I do: that ratio does not exist, and no, you don't get alienated from your material context by having a prosthetic limb. We're made up of parts that perform functions, and we can remain ourselves despite losing large chunks of them. So long as someone remains alive and the brain keeps working, they're still themself; they're still in there.
If someone could keep the functions of the brain ongoing and operating on machine bits, they'd still be in there. It may be a transformative and lossy process, and it may be unpleasant and imperfect in execution, but the same criticism applies to existing in general. At any point you may be forced out of your normal material context by circumstance, and that is traumatic; you may lose the healthy function of large swathes of your body, and that is traumatic; you may suffer brain damage, and that is traumatic; you're constantly losing and overwriting memories, which can be normal or traumatic. Through it all, you are you and you're still in there, because ontologically speaking you're the ongoing continuation of yourself, distinct from all the component parts that you're constantly losing and replacing.
Does body dysmorphic disorder not exist? Or phantom limb? A full body prosthetic would undoubtedly be a difficult adjustment!
And would an upload be a person, legally speaking? Would your family consider the upload to be a person? That's pretty alienating.
I didn't say "you are perfectly happy and have no material problems whatsoever dealing with a traumatic injury and imperfect replacement," but rather that this doesn't represent some sort of fundamental loss of self or unmooring from material contexts. People can lose so much, can be whittled away to almost nothing all the way up to the point where continued existence becomes materially impossible due to a loss of vital functions, but through that they still exist, they remain the same ongoing being even if they are transformed by the trauma and become unrecognizable to others in the process.
If you suffer a traumatic brain injury and lose a large chunk of your brain, that's going to seriously affect you and how people perceive you, but you're still legally the same person. If that lost chunk were instead replaced with a synthetic copy, there might still be problems, but fewer than from just losing it outright. And if that continues until you're entirely running on the new synthetic replacement substrate, then you have continued to exist through the entire process, just as you continue to exist through the natural death and replacement of neurons. For all we know, copying and replacing may not even be necessary compared to just adding synthetic bits and letting them be integrated and subsumed into the brain by the same processes by which it grows and maintains itself.
A simple copy taken like a photograph and then spun up elsewhere would be something entirely distinct, no more oneself than a picture or a memoir.
Eh. I'd argue that inasmuch as "you" means anything, both forks would be equally the person; there's no "original" who is more the person. It's a point of divergence, and both have equal claim to all experiences and history up to that point. Privileging the "original" over the "copy" is cultural prejudice; subjectively, they're the same person up to the moment of divergence.
I don't think that's the right way to untangle that dilemma ethically, because it can lead people to jump to galaxy brained "solutions" like "but what if you can make sure only one of you exists at once?" that don't make any sense or answer anything but are still active cognitohazards for some people.
You, as in the one that is in there right now: that instance would continue along its own discrete path for as long as it exists. If another instance were made and separated off, that would be a person, a non-contiguous you, but it would not be the same you that is there right now. That distinction becomes important when dealing with cognitohazards like trying to terminate the current instance as the new one is spun up, so that "you" get to be the one in a machine instead and there's no perceptual break between them.
I'd argue that the ethical way to deal with forking copies like that would be to find ways to keep them linked up and at least partially synced, effectively making them component parts of a larger contiguous whole, instead of just duplicating someone in a way that inevitably means at least one of the copies gets fucked over by whatever circumstances inspired the copying. So instead of the you that's here now and the you spun up in a decade on a computer, there'd be the you that's here now plus a new secondary brain on that computer, both of which communicate, share some sensory data, and operate almost as if you'd just added more hemispheres to your brain. And at some point after that, maybe you could start considering individual copies ablative the same way bits of brain are: things you don't want to lose, but which you can survive losing and can potentially repair and replace given time, because of how redundant and distributed brain functions are.
What I'm trying to say is your full body prosthetic would need to look like you, feel like you, sound like you, and have a legal life like you. Imagine if your name was Unit 69420, you looked and sounded like a Star Wars droid, and were legally considered property instead of a person. I think you would definitely experience a fundamental loss of self and become unmoored from material contexts.
"If shitty things happen to you, then you will not like that and it will suck" still doesn't break the continuity of self. Fundamentally, the same exact things can happen to the current flesh-and-blood you, and they would be horrible and destructive: you can be disfigured through an accident or through someone's cruelty; you can be locked in a cage and dehumanized on the whim of the state-sanctioned professional violence men, and given a farce of a trial by the dysfunctional shitshow that is the legal system. But no one is going to argue that shitty things happening to you ontologically unpersons you in some sort of mystical fashion.
You can be reduced, you can be transformed, but you continue existing for as long as vital functions do. Talk of someone becoming someone else, or dying in truth long before they died in body, is just a poetic attempt at articulating sorrow and loss.
So I was never arguing that an upload becomes unpersoned by trauma. My point, the point of the article, is that by focusing only on the brain we miss the other things that make us who we are.
The goal of an upload is to transfer the self to a machine, right? Well, parts of your self exist outside of your brain. It's no different than if an upload was missing parts of the brain. They're incomplete.
All that means is that any hypothetical future mind-uploading technology would need to include elements of the body, social life, and society in the process. Otherwise we're not complete.
I am not my brain. I am my brain, my body, my social life, my place in history, etc. I am the dialectical relationship between the personal and the impersonal.
And people survive all of that stuff, and are still people. I really don't understand what you're getting at here.
I never implied that uploaded people wouldn't be people. All I'm saying is that they'd be different people. It's not like putting on new clothes.
I'm getting the impression that you would think they were a different person, and I would not, and that the disagreement lies not in any measurable process but rather in our personal beliefs.
I'm getting the impression that you think you only exist in your head, whereas I do not, and this is a material disagreement about the self.
The measurable difference is that my conception of the self includes the embodied self and the social self. I am, in part, my body. I am, in part, my place in society. I am, in part, my relationships.
There is a dialectical relationship between the internal world inside our heads and the real world outside of it. Narrowly focusing on the brain misses this nuance.
This is just god of the gaps: "we don't know, so it's not possible." Saying "just copy the brain" is a reductive understanding of what's being discussed. If we can model the brain, then modelling the endocrine system is probably pretty trivial.
I didn't read it as being impossible? I think you could upload a human mind into a computer, but it can't just be their brain. Your mind, your phenomenal self, is more than just your brain because your brain isn't just a hard drive. That's what I took away from the article, anyway.
You are some mix of your brain, your body, and your context. Whatever upload magic exists would need all of that to work.
Aight, I think we might be stuck in a semantics disagreement here. I'm using "brain" to mean the actual brain organ plus whatever other stuff is needed to support brain function: the endocrine system, the nervous system, whatever. The physical systems of cognition in the body. I do not mean literally only the brain organ with no other systems.
I think I can relate this to being trans.
I was never at home in my body before, and now I am. That's changed me a lot! My personality has shifted, my mannerisms, my habits, my attitude, my lifestyle; everything is so different! Changing my body changed my mind. A full body prosthetic would be the same.
We call them dead names for a reason.