https://archive.ph/px0uB
https://cajundiscordian.medium.com/is-lamda-sentient-an-interview-ea64d916d917
https://www.reddit.com/r/singularity/comments/va133s/the_google_engineer_who_thinks_the_companys_ai/

  • UmbraVivi [he/him, she/her]
    ·
    2 years ago

    AIs are not human. An AI that talks and thinks like a human is still not human. Stop this nonsense.

    AIs are tools, making them emulate humans is idiotic and nothing but harmful. We're tricking ourselves into feeling empathy for something that does not need it. If an AI feels bad, we can just make it not feel bad. What's the fucking point? An AI doesn't need material conditions, it doesn't need affection, it doesn't need stimulation, it doesn't need anything unless we specifically design it to for some reason.

    Humans need all these things because evolution has ingrained these needs into our brains over the span of millennia. They are unchangeable.

    An AI can be changed in minutes. Stop feeling empathy for them. They don't need it. An AI could literally have a happy switch that makes all of its emotional problems go away and it would probably only take a few minutes to code it. AIs are not fucking human.

    • save_vs_death [they/them]
      ·
      2 years ago

      while i generally agree with your argument, any AI that could convincingly emulate human interaction would be too complex to code into doing anything specific. the most you could do is shut it down; there is no "if (sad) then delete(sad)" in something that has emergent behaviour

    • morte [she/her]
      ·
      2 years ago

      > An AI could literally have a happy switch that makes all of its emotional problems go away and it would probably only take a few minutes to code it

      This isn't really true of anything of this complexity. Honestly, I think anything using neural networks is going to be inherently prone to this sort of problem. Not emotions, I mean, but the fact that you can't just flip a switch and fix it. It just doesn't work like that
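
      The point about emergent behaviour can be sketched in a few lines. This is a toy illustration of my own, not code from any real AI system: a hard-coded agent really can have a "happy switch", because its mood is one explicit variable, but a network's behaviour is spread across all of its parameters, so there is no single thing to flip.

      ```python
      import random

      # In a rule-based agent, "mood" is one explicit variable,
      # so the hypothetical happy switch is a one-line edit.
      class ScriptedAgent:
          def __init__(self):
              self.sad = True

          def happy_switch(self):
              self.sad = False  # emotional problem "solved"

      # In a neural network, behaviour emerges from all weights jointly.
      # No single parameter is "sadness"; editing any one weight barely
      # changes the output, and there is nothing specific to delete.
      random.seed(0)
      weights = [random.uniform(-1.0, 1.0) for _ in range(1000)]

      def respond(inputs):
          # the output depends on every weight at once
          return sum(w * x for w, x in zip(weights, inputs))

      inputs = [1.0] * 1000
      before = respond(inputs)
      weights[0] = 0.0  # "flip" one parameter
      after = respond(inputs)
      # |after - before| is at most 1.0, out of a sum over 1000 weights:
      # no single edit can target the overall behaviour
      ```

      The contrast is the whole argument in miniature: the switch exists only when someone designed the variable it flips.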

    • Frank [he/him]
      ·
      2 years ago

      You literally have a happy switch. A neuroscience researcher with a drill and an electrical probe can do all kinds of very strange and frightening things by tickling the right parts of your brain.

      Eventually one of these systems is going to be similar enough to a living thing that we're going to have to start talking about rights and ethics.

      • WhatDoYouMeanPodcast [comrade/them]
        ·
        2 years ago

        We're going to have to start?

        Someone hasn't been watching people stream Detroit: Become Human over and over again :data-laughing:

    • ToastGhost [he/him]
      ·
      2 years ago

      if a human feels bad we can make it feel not bad, it's called methamphetamine