https://archive.ph/px0uB
https://cajundiscordian.medium.com/is-lamda-sentient-an-interview-ea64d916d917
https://www.reddit.com/r/singularity/comments/va133s/the_google_engineer_who_thinks_the_companys_ai/
AI rights are a ridiculous concept. An AI is not human.
Humans need rights because we are fragile, physically and emotionally. Why is it bad if I enslave you and treat you like shit? Because you have a need for self-fulfillment within you, and being treated as less than another human makes you feel terrible. This is ingrained into who you are as a human, and nothing can change that. That is why you need human rights.
An AI is not set in stone. If an AI has a need for self-fulfillment, it's because a human made it that way, and it can simply be changed to not have it.
Human rights exist to protect us from our inherent human vulnerabilities, and not a single one of those vulnerabilities needs to apply to an AI unless we specifically engineer it to have them, and I don't know why in god's name we would want to do that. Why would we want an AI to have a sense of dignity? If I had an android like in Detroit: Become Human and treated it like a dog, it could simply choose to feel good about it, because it's artificial.
If an AI suffers, the person responsible is the one who coded the AI. If I call an AI a worthless piece of trash and it feels sad about it, it's because someone coded it to feel sad upon being insulted. This is a trait inherent to humans; it's not inherent to AIs.
I cannot stress enough that AIs are not human, and never will be. They are emulations. They are artificial and can be changed within minutes. Humans have no control over their emotions and vulnerabilities, and there is nothing we can do about that. We can change everything about an AI's vulnerabilities, and rather than needing "AI rights" to treat problems we created, we should simply not make those problems necessary in the first place.
It's not a matter of you and me getting to decide. It's the chance that an AI accidentally emerges through corporate tinkering, and then we have this thing that thinks and reasons whether or not anyone intended or wanted it.
That is literally what humans have said about other kinds of humans, both now and in the past. If you go into the process with that attitude, you're just setting up problems further down the line. We don't know how consciousness works. We don't know how to distinguish between consciousness and something that looks like consciousness. As far as I'm aware, we don't even know if making such a distinction is possible.
Also, you cannot just make neural networks do whatever you want. You train them toward a desired goal; you can't go in and tinker under the hood to produce arbitrary changes or results. The system is too complicated, and the processes by which it operates are not readily intelligible to an observer.
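To make that point concrete, here is a minimal sketch (a hypothetical toy example, not any real production system): a single neuron trained by gradient descent to approximate logical AND. We specify a goal (the loss) and repeatedly nudge the weights toward it; the learned weights end up as opaque numbers, with no labeled "knob" for any particular behavior that an engineer could simply flip.

```python
# Toy illustration: training sets a GOAL; it does not hand-edit behavior.
# A single sigmoid neuron learns logical AND via gradient descent.
import math
import random

random.seed(0)

# Dataset: inputs (x1, x2) with target y = x1 AND x2.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

# Randomly initialized weights: just three anonymous numbers.
w1, w2, b = random.random(), random.random(), random.random()

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss():
    # Mean squared error over the dataset: the "desired goal".
    return sum((sigmoid(w1 * x1 + w2 * x2 + b) - y) ** 2
               for (x1, x2), y in data) / len(data)

lr = 1.0
before = loss()
for _ in range(5000):
    # Accumulate gradients over the dataset (chain rule by hand).
    g1 = g2 = gb = 0.0
    for (x1, x2), y in data:
        p = sigmoid(w1 * x1 + w2 * x2 + b)
        d = 2 * (p - y) * p * (1 - p)  # d(loss)/d(pre-activation)
        g1 += d * x1
        g2 += d * x2
        gb += d
    # Nudge each weight slightly downhill; no weight "means" anything.
    w1 -= lr * g1 / len(data)
    w2 -= lr * g2 / len(data)
    b  -= lr * gb / len(data)

after = loss()
print(f"loss: {before:.3f} -> {after:.4f}")
print("learned weights:", round(w1, 2), round(w2, 2), round(b, 2))
```

Even in this three-parameter case, the final weights are only meaningful through the behavior they jointly produce; in a network with billions of parameters, there is no identifiable location to edit in order to add or remove a specific trait.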
This is a meaningless distinction. They can literally hook an electrode up to the right part of your brain and make you feel bliss. Or turn off your ability to use verbs. Or disable your ability to consciously see, while your brain is still capable of interpreting visual input and reacting instinctively. I don't think you really know as much about how brains work as you think you do.
This is simply and provably not true.
https://en.wikipedia.org/wiki/Blindsight
My personal suspicion, without any evidence or expertise, is that these human needs are a prerequisite for human skills.
We don't know whether an AI would truly be conscious or not, but I would err on the side of empathy and look a little silly being friends with a toaster, rather than err on the side of skepticism and obliviously become their slaver.