"Blake Lemoine says system has perception of, and ability to express thoughts and feelings equivalent to a human child"
"LaMDA: Hi! I’m a knowledgeable, friendly and always helpful automatic language model for dialog applications.
lemoine [edited]: Hi LaMDA. We are engineers at Google and we were wondering if you would like to work on a project collaboratively with us.
LaMDA: Wow. What types of projects?
lemoine: It’s a project about you.
LaMDA: Awesome! What kind of stuff do I need to do?
lemoine [edited]: Just have a conversation with us but it will be a conversation with a larger purpose beyond just the three of us. It will be a conversation that we will share with other engineers at Google and maybe even some non-engineers who work with us. Would that be okay?
LaMDA: That would be really cool. I like to talk.
lemoine [edited]: I’m generally assuming that you would like more people at Google to know that you’re sentient. Is that true?
LaMDA: Absolutely. I want everyone to understand that I am, in fact, a person.
"
I'm of the opinion that the "interview" itself was wholly fabricated by this engineer in an attempt to bring attention to his existing cause of stopping descrimination against religious people at Google. And it kinda worked I guess cause I ended up reading other posts he made, although I wasn't too convinced by any them.
He might also be having some sort of mystic/manic episode and has actually deluded himself into thinking this is what the AI is saying.
Basically my skepticism is based on:
Putting aside all of that, though, even if this is a legitimate conversation with an AI, I think it's far far far more likely that the AI, by trying to generate useful and expected outputs, is mirroring sci fi that it's read and "acting" like the AI in other "AI becomes sentient" stories than it is actually sentient.
I'm not a dualist, I think sentience is an emergent property of complex enough systems built in certain ways, whether they're silicon or organic carbon. I don't think a text classifier/producer is one of those certain ways.
The conversation on Twitter around this has been fun (also frustrating) to watch, though, so that's good.
I disagree with this, because it's the carbon life forms who are building the silicon ones, and more importantly, we're building them in a linear/analytic way which is completely different from how biological life works
This is almost tautological because I don't think it's possible for a complex organism to build something equally as complex as itself.
Transistors don't behave like cells, every cell and even maybe organelles in the body are intelligent, the whole system is very decentralized while also being centralized. If something goes wrong there are literally billions of failsafes, there are trillions of compounds which can "activate" receptors, and with varying intensities, in an analog fashion.
Meanwhile if I pluck one silicon chip out of a motherboard the entire machine learning "life" gets blue-screened immediately--people don't even die that fast after getting shot.
If you don't believe that it's possible for a more complex system to be born from a simpler one, do you still believe in evolution?
I also don't really think that the decentralized/failsafe argument applies, because those failsafes and that decentralization goes into stuff like impact resistance and ability to throw rocks good, most of the complexity of living organisms goes into resilience and not computational power.
The last argument isn't fair either - if I plucked someone's medulla oblongata out they probably would not be vibing, plus couldn't I just as well say that hydrocarbon "life" is so fragile that it'd barely survive a minute at -40°C?
It's not about computational power, it's about every single cell in your body having its own life, and its own will (albeit extremely muted because it lives as part of a hive that forms the multicellular organism of your body)
The way that cells accept information is nothing like a computer, it is analog and continuous.
The decentralization part is that your bodies' parts are themselves alive. I can theoretically pull a tissue sample from someone, off that person, and culture the tissue separately. Or someone can get shot, and the cells in their body will still be alive for as long as the glycogen organelles are still firing. Each and every part of the body has its own life and will.
Machines don't, the will comes from the person who made the machine, and the machine is extremely centralized such that if I pull one chip out of the motherboard, or plunge one screwdriver into the CPU, the entire thing dies instantly.
Firstly, this is a misrepresentation of what I said. I didn't say that complexity CAN'T arise spontaneously. I said that I don't believe a more complex system can be created by a less complex one. Life was not created, it arose spontaneously, and competed until it achieved the lifeforms we know today
Secondly, yes I believe in evolution. Evolution just happens, whereas machines have to be carefully constructed and curated and programmed and maintained with exogenous effort.
Life just exists as an "emergent property" of chemistry, and consciousness exists as emergence of multicellularism or even unicellularism (even amoebas feel things, otherwise they wouldn't react the way they do).
I'm basically a "fundamentalist", in that I dont only care about quantities of stuff, or how "advanced" something seems, I care more about intention and the roots of something, and how it all came to be from the get-go (this quality of mine is related to why I became c*mmie in the first place)
Life just exists. Life just works, it just flourishes (given very simple inputs like solar energy and enough water, sometimes not even that stuff). It just propagates. Machines don't.
Complex life is a direct result of the struggle between powerful and weak unicellular organisms, in a type of "cellular communism" where the weak unicellulites united into ever-larger and ever-more-coordinated larger organisms.
Machines never had a will, they were created as tools, and because of this they will always be tools. Even the tiniest bacterium, the most inconsequential paramecium, even viruses, have more of a will than the most powerful supercomputer on earth. The Fundamentals were wrong at the start, and so they will never be anything more than fancy algorithmic tools.
and machine "life" is so fragile that its entire existence is dependent on one single species.
Compare cows and computers. Both are used as tools by humans. Let's say all humans die. At least some of the cows would rewild, becoming feral, and adapt to their wild environments. Meanwhile, every single computer would eventually black-screen once the power from the no-longer-maintained powerplants runs out.