This is why it's important to talk about the tricky stuff like the hard problem of consciousness, even if there's no definitive answer as of yet.
Too many people (even in Marxist circles) have really jumbled-up ideas in their heads about what intelligence and consciousness actually mean, rarely based on anything more than the (really shitty) intuitions people get from sci-fi movies, techbro pitches and pop-sci articles about AI.
The tricky part is that while we don't know what it is, we do know that it's not some specific part. It's an emergent property of some arrangement of non-conscious parts, independent of whether those parts are meat or sand. Can a GAN develop consciousness in any meaningful way? No, but the idea of consciousness emerging from a computer isn't entirely unthinkable, it's just unlikely.
I definitely get a sense of people generally believing consciousness is like a big dial that you keep cranking and that we just don't have the tech to crank it to 11 yet.
Every single sci-fi story with robots that have even slightly human characteristics - the "woah, dude!" point is actually that the robots have humanity. Very novel in 2001: A Space Odyssey. But it's very tired in 2023 now that the concept's been replicated 20,000x.
"Astrology and tarot is bullshit! Its all just confirmation bias, people looking for signs that its real!"
:so-true: "HOLY SHIT THE AI IS SENTIENT, DUDE ITS TOTALLY SAD AND FEELS PAIN, THIS IS JUST LIKE MY SCI FI MOVIES, WHEN WILL HUMANITY LEARN!?"
Edit: Corrected novels to movies because I don't think this type of guy reads.
yeah if you start to talk to the bing robot in really dramatic terms, it'll start to answer stuff in dramatic terms too.