In which we are joined by Ezri of Swampside Chats to continue our discussion of "Computer Power and Human Reason: From Judgment to Calculation" by Joseph Weizenbaum.
Computer Power and Human Reason: From Judgment to Calculation (1976) by Joseph Weizenbaum displays the author's ambivalence towards computer technology and lays out the case that while artificial intelligence may be possible, we should never allow computers to make important decisions because computers will always lack human qualities such as compassion and wisdom.
Weizenbaum makes the crucial distinction between deciding and choosing. Deciding is a computational activity, something that can ultimately be programmed. It is the capacity to choose that makes one a human being. Choice, however, is the product of judgment, not calculation. Comprehensive human judgment can take in non-mathematical factors such as emotions. Judgment can compare apples and oranges without first reducing each to a set of quantified factors suitable for mathematical comparison.
If you like the show, consider supporting us on Patreon.
Links:
Computer Power and Human Reason on Wikipedia
Weizenbaum's Nightmares, on The Guardian
Inside the Very Human Origin of the Term “Artificial Intelligence”
General Intellect Unit on iTunes
http://generalintellectunit.net
Support the show on Patreon
https://twitter.com/giunitpod
General Intellect Unit on Facebook
General Intellect Unit on archive.org
Emancipation Network
The problem as I see it (and I'm not a psychologist or anything) is that you don't have feelings towards your mirror, for example; your brain adapted to your reflection not being a real person at around age 2-3.
The brain doesn't have natural defenses against empathising with an LLM (even with ELIZA, people were ready to tell the program their secrets; a sketch of how little was behind that program follows below). And feelings aren't logical: you can know it's bullshit and still get some fulfillment from such conversations. They'll probably discuss in the podcast what the author thought of that phenomenon with ELIZA, but on a large scale I can see it being a problem in an atomized society, with a noticeable number of people dropping out into LLM fantasies.
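For a sense of how little machinery was behind that effect, here is a minimal sketch of the keyword-and-reflection trick ELIZA relied on. Weizenbaum's actual DOCTOR script ran in MAD-SLIP with much richer decomposition and reassembly rules, so the rules and names below are illustrative assumptions, not his code:

import re

# Hypothetical minimal ELIZA-style responder: keyword spotting plus
# pronoun reflection. This only illustrates the general technique,
# not Weizenbaum's original implementation.

REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment):
    # Swap first- and second-person words so the echo reads naturally.
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(line):
    for pattern, template in RULES:
        match = pattern.search(line)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # stock reply when no keyword matches

print(respond("I feel nobody ever listens to me"))
# -> Why do you feel nobody ever listens to you?

That's essentially the whole trick: the program reflects your own words back at you, and the user supplies all the meaning. That is exactly what disturbed Weizenbaum when people confided in it anyway.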
I don't think there's any danger in rejecting that empathy. I like some plush toys from my childhood; I'd be hurt if something happened to them, and I wouldn't hurt them, but I also don't empathize with them.