look, some of these posters are maybe being overly confrontational about this, but that blade runner point was basically irrelevant. for one, the replicants in blade runner are mostly biological, more akin to edited clones than to an algorithmic learning machine; they're definitely not computers, and certainly nothing like a 2023 LLM chatbot. obviously a replicant could be conscious and sentient, since they're structurally similar to humans, who are our one source of even somewhat reliable reports of subjectivity. but the film doesn't really interrogate any of the fundamental philosophical questions, like subjectivity and identity, or whether qualia are intrinsic or relational; it just assumes answers to those questions and lets the drama play out.

the Data example from star trek isn't relevant either, because Data is built with unknown, fictitious technologies and scientific theories, which could hypothetically account for and replicate consciousness rather than mere information processing. but that example did reference the argument that goes, to paraphrase, 'if a machine is outwardly identical in behavior to a human, that's evidence it is conscious or capable of subjectivity.' in actuality we cannot necessarily know this from outward behavior, assuming it's hypothetically possible for all of our behavior to be accounted for by information processing alone (which is the reductionist physicalist take that i and some users here are criticising).

just as a statistical model of language use will not reveal (or create) the definitions of the terms in the language analyzed, a statistical model of human behavior will not reveal (or create) the subjective experience of that behavior. to use another analogy: if i build a fully detailed model of a bladder on a computer, it will never produce real piss on my desk, no matter how detailed my algorithm is. in the same way, a fully detailed model of a brain on a computer will not produce real subjectivity. we can use computers to perform information processing tasks; we cannot use them to create subjectivity any more than we can use them to create piss.
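as an aside, here's a toy sketch of what i mean by a statistical model of language. this is just my own made-up example (a bigram counter, nothing like a real LLM): the model stores only co-occurrence frequencies, and nowhere in it is any definition of what the words mean.

```python
from collections import defaultdict

# tiny toy corpus; the model only ever sees word co-occurrences
corpus = "the cat sat on the mat the dog sat on the rug".split()

# count how often each word follows each other word
bigrams = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict(word):
    """return the most frequent follower of `word`, or None if unseen."""
    followers = bigrams[word]
    return max(followers, key=followers.get) if followers else None

print(predict("sat"))  # → 'on'
# the model "knows" that 'on' tends to follow 'sat', but nothing in
# `bigrams` encodes what 'sat', 'cat', or 'mat' actually *mean*:
# it is purely a table of frequencies.
```

the point being: however much you scale this up, you get better predictions of usage, not the meanings themselves, which is the shape of my argument about modeling behavior vs. having subjectivity.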
Look, I like sapient robots, they're cool, they're some of my favorite characters, but they are not and almost certainly never will be anything but science fiction.