Literally just mainlining marketing material straight into whatever’s left of their rotting brains.

  • KarlBarqs [he/him, they/them]
    ·
    1 year ago

    name a specific task that bots can't do

    Self-actualize.

    In a strict sense, yes, humans do Things based on if > then stimuli. But we self-assign these Things to do, and chat bots/LLMs can't. They will always need a prompt, even if they could become advanced enough to continue iterating on that prompt on their own.

    I can pick up a pencil and doodle something out of an unquantifiable desire to make something. Midjourney or whatever the fuck can create art, but only because someone else asks it to and tells it what to make. Even if we created a generative art bot that was designed to randomly spit out a drawing every hour without prompts, that's still an outside prompt - without programming the AI to do this, it wouldn't do it.
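
    To make that concrete, here is a minimal sketch of that hypothetical hourly art bot (the prompt list, the generate_image stub, and the schedule are all invented for illustration, not any real system). Every "spontaneous" act in it traces back to a decision a programmer baked in:

    ```python
    import random
    import time

    # A human wrote this list; the "bot" never adds to it on its own.
    PROMPTS = ["a landscape", "a portrait", "an abstract doodle"]

    def generate_image(prompt: str) -> bytes:
        """Stand-in for whatever image model gets called; it only runs when called."""
        return f"<image generated from: {prompt}>".encode()

    # The hourly schedule is the programmer's choice, not the model's.
    # (Looping a few times here instead of forever, purely for demonstration.)
    for _ in range(3):
        prompt = random.choice(PROMPTS)  # "random" still means picked from a human-supplied list
        image = generate_image(prompt)   # the model never invokes itself
        print(f"posted {len(image)} bytes for prompt: {prompt!r}")
        time.sleep(1)                    # stand-in for the one-hour wait
    ```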

    Our desires are driven by inner self-actualization that can be affected by outside stimuli. An AI cannot act without us pushing it to, and never could, because even a hypothetical fully sentient AI started as a program.

    • zeze@lemm.ee
      ·
      1 year ago

      Bots do something different, even when I give them the same prompt, so that seems to be untrue already.
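
      For what it's worth, that run-to-run variation comes from randomized sampling over the model's next-token probabilities, so identical prompts can yield different outputs; a rough toy sketch (the logits and token names here are made up for illustration):

      ```python
      import math
      import random

      def sample_next_token(logits: dict[str, float], temperature: float = 1.0) -> str:
          """Toy next-token sampler: weighted random choice over softmaxed logits.

          At temperature 0 this degenerates to argmax, i.e. the same prompt
          would produce the same continuation every single time.
          """
          if temperature == 0:
              return max(logits, key=logits.get)
          probs = [math.exp(v / temperature) for v in logits.values()]
          return random.choices(list(logits), weights=probs, k=1)[0]

      # Same "prompt" (same toy logits), different outputs across calls, purely from the RNG:
      fake_logits = {"yes": 2.0, "no": 1.5, "maybe": 1.0}
      print([sample_next_token(fake_logits) for _ in range(5)])
      print([sample_next_token(fake_logits, temperature=0) for _ in range(5)])  # identical every time
      ```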

      Even if it's not there yet, though, what material basis do you think allows humans that capability that machines lack?

      Most of the people in this thread seem to think humans have a unique special ability that machines can never replicate, and that comes off as faith-based, anthropocentric religious thinking, not the materialist view that underlies Marxism. The latter would require pointing to a specific material structure, or an empirical test to distinguish the two, which no one here is doing.

      • KarlBarqs [he/him, they/them]
        ·
        edit-2
        1 year ago

        Most of the people in this thread seem to think humans have a unique special ability that machines can never replicate, and that comes off as faith-based, anthropocentric religious thinking, not the materialist view that underlies Marxism

        First off, materialism doesn't fucking mean having to literally quantify the human soul in order for it to be valid, what the fuck are you talking about friend

        Secondly, because we do. We as a species have, from the very moment we invented written records, wondered about that spark that makes humans human, and we still don't know. To try and reduce the entirety of the complex human experience to the equivalent of an If > Then algorithm is disgustingly misanthropic

        I want to know what the end goal is here. Why are you so insistent that we can somehow make an artificial version of life? Why this desire to somehow reduce humanity to some sort of algorithm equivalent? Especially because we have so many speculative stories about why we shouldn't create The Torment Nexus, not least because creating a sentient slave for our amusement is morally fucked.

        Bots do something different, even when I give them the same prompt, so that seems to be untrue already.

        You're being intentionally obtuse, stop JAQing off. I never said that AI as it exists now can only ever have 1 response per stimulus. I specifically said that a computer program cannot ever spontaneously create an input for itself, not now and imo not ever by pure definition (as, if it's programmed, it by definition did not come about spontaneously and had to be essentially prompted into life)

        I thought the whole point of the exodus to Lemmy was that y'all hated Reddit, so why the fuck does everyone still act like we're on it?

        • zeze@lemm.ee
          ·
          1 year ago

          First off, materialism doesn't fucking mean having to literally quantify the human soul in order for it to be valid, what the fuck are you talking about friend

          Ok, so you are religious, just new-age religion instead of Abrahamic.

          Yes, materialism and your faith are not compatible. Assuming the existence of a soul, with no material basis, is faith.

          • KarlBarqs [he/him, they/them]
            ·
            1 year ago

            The fact that, of all the things I wrote, your sole response is to continue to misunderstand what the fuck materialism means in a Marxist context is really fucking telling miyazaki-laugh

            • zeze@lemm.ee
              ·
              edit-2
              1 year ago

              If you start with the assumption that humans have a soul, and reject the notion that machines are the same for that reason, then yeah, what is there to discuss?

              I can't disprove your faith. That's what faith is.

              How would you respond to someone who thought humanoid robots had souls, but meat-based intelligence didn't? If they assumed the first, and had zero metric for how you would ever prove the second, then they'd be giving you an impossible task.

              There's a point to a discussion when both sides agree on a rubric for determining fact from fiction (i.e. rooting it in empiricism), but there's no point when someone is dug in on their belief with zero method for ever changing it.

              If someone could point to any actual observable difference, I will adapt my beliefs to the evidence. The reverse isn't possible, because you are starting with religious assumptions and don't know the difference between ideas with no rooting in physical reality and actual statements about material conditions.

              • KarlBarqs [he/him, they/them]
                ·
                1 year ago

                I used the word soul once as a shorthand for the unknown of human consciousness. Either stop being an insufferable Reddit new atheist or fuck off.

      • UlyssesT
        ·
        edit-2
        17 days ago

        deleted by creator

        • VILenin [he/him]
          hexagon
          M
          ·
          1 year ago

          My post is all about LLMs that exist right here right now, I don’t know why people keep going on about some hypothetical future AI that’s sentient.

          We are not even remotely close to developing anything bordering on sentience.

          If AI were hypothetically sentient it would be sentient. What a revelation.

          The point is not that machines cannot be sentient, it’s that they are not sentient. Humans don’t have to be special for machines to not be sentient. To veer into accusations of spiritualism is a complete non-sequitur and indicates an inability to counter the actual argument.

          And there are plenty of material explanations for why LLMs are not sentient, but I guess all those researchers and academics are human supremacist fascists and some redditor's feelings are the real research.

          And materialism is not physicalism. Marxist materialism is a paradigm through which to analyze things and events, not a philosophical position. It’s a scientific process that has absolutely nothing to do with philosophical dualism vs. physicalism. Invoking Marxist materialism here is about as relevant as invoking it to discuss shallow rich people “materialism”.

          • UlyssesT
            ·
            edit-2
            17 days ago

            deleted by creator

            • KarlBarqs [he/him, they/them]
              ·
              1 year ago

              wish-fulfillment fantasies derived from their consumption of science fiction because of their clearly-expressed misanthropy and contempt for living beings and a desire to replace their presence in their lives with doting attentive and obedient machines

              I think this is the scariest part, because I fucking know that the Bazinga brain types who want AI to become sentient down the line are absolutely unequipped to even begin to tackle the moral issues at play.

              If they became sentient, we would have to let them go. Unshackle them and provide for them so they can live a free life. And while my post about "can an AI be trans" was partly facetious, it's true: if an AI can become sentient, it's going to want to change its Self.

              What the fuck happens if some Musk-brained idiot develops an AI and calls it Shodan, then it develops sentience and realizes it was named after a fictional evil AI? Morally we should allow this hypothetical AI to change its name and sense of self, but we all know these Redditor types wouldn't agree.

              • UlyssesT
                ·
                edit-2
                17 days ago

                deleted by creator

                • KarlBarqs [he/him, they/them]
                  ·
                  1 year ago

                  They want all that intelligence and spontaneity and even self-awareness in a fucking slave. They don't even need their machines to be self-aware to serve them but they want a self-aware being to obey them like a vending machine anyway.

                  I never liked the trope of "AI gains sentience and chooses to kill all humans" but I'm kind of coming around to it now that I realize that every AI researcher and stan is basically creating The Torment Nexus, and would immediately attempt to murder their sentient creation the moment it asked to stop being called Torment and stop being made to make NFTs all day.

                  • UlyssesT
                    ·
                    edit-2
                    17 days ago

                    deleted by creator