Literally just mainlining marketing material straight into whatever’s left of their rotting brains.

    • PolandIsAStateOfMind@lemmygrad.ml
      ·
      edit-2
      1 year ago

      You're right that it isn't, though considering science has huge problems even defining sentience, it's a pretty moot point right now. At least until it starts to dream about electric sheep or something.

      • UlyssesT
        ·
        edit-2
        22 days ago

        deleted by creator

        • VILenin [he/him]
          ·
          1 year ago

          Every time these people come out with accusations of "spiritualism", it's always projection.

          • UlyssesT
            ·
            edit-2
            22 days ago

            deleted by creator

            • aaaaaaadjsf [he/him, comrade/them]
              ·
              edit-2
              1 year ago

              By playing god, people keep reinventing god. It's deeply ironic and reminds me of this interpretation of Marx, and critique of modernity, by Samir Amin:

              Nevertheless, another reading can be made of Marx. The often cited phrase--"religion is the opium of the people"--is truncated. What follows this remark lets it be understood that human beings need opium, because they are metaphysical animals who cannot avoid asking themselves questions about the meaning of life. They give what answers they can, either adopting those offered by religion or inventing new ones, or else they avoid worrying about them.
              In any case, religions are part of the picture of reality and even constitute an important dimension of it. It is, therefore, important to analyze their social function, and in our modern world their articulation with what currently constitutes modernity: capitalism, democracy, and secularism.

              The way many see AI is simply the "inventing new ones" part.

          • spacecadet [he/him]
            ·
            1 year ago

            Yessss this is refreshing to read. Secularists taking massive leaps of faith while being smug about how they aren't.

      • zeze@lemm.ee
        ·
        1 year ago

        That's just it: if you can't define it clearly, the question is meaningless.

        The reason people will insist on ambiguous language here is that the moment you find a specific definition of what sentience is, someone will quickly show machines doing it.

        • UlyssesT
          ·
          edit-2
          22 days ago

          deleted by creator

    • zeze@lemm.ee
      ·
      1 year ago

      So you can't name a specific task that bots can't do? Because that's what I'm actually asking; this wasn't supposed to be metaphysical.

      It will affect society whether or not there's something truly experiencing everything it does.

      All that said, if you think carbon-based things can become sentient and silicon-based things can't, what is the basis for that belief? It sounds like religious thinking, that humans are set apart from the rest of the world, chosen by god.

      A materialist worldview would focus on what things do, what they consume and produce. Deciding humans are special, without a material basis, isn't in line with materialism.

      • m532 [she/her]
        ·
        1 year ago

        You asked how ChatGPT is not AI.

        ChatGPT is not AI because it is not sentient. It is not sentient because it is a search engine; it was not made to be sentient.

        Of course machines could theoretically, in the far future, become sentient. But LLMs will never become sentient.

        • silent_water [she/her]
          ·
          1 year ago

          the thing is, we used to know this. 15 years ago, the prevailing belief was that AI would be built by combining multiple subsystems together - an LLM, visual processing, a planning and decision-making hub, etc. we know the brain works like this - idk where it all got lost. profit, probably.

          • TreadOnMe [none/use name]
            ·
            1 year ago

            It got lost because the difficulty of actually doing that is overwhelming, probably not even accomplishable in our lifetimes, and it is easier to grift and get lost in a fantasy.

            • zeze@lemm.ee
              ·
              1 year ago

              The jobs with the most people working them are all in the process of being automated.

              Pretending it's not happening is going to make it even easier for capital to automate most jobs, because no one tries to stop something they don't believe is happening in the first place.

          • UlyssesT
            ·
            edit-2
            22 days ago

            deleted by creator

      • UlyssesT
        ·
        edit-2
        22 days ago

        deleted by creator

      • KarlBarqs [he/him, they/them]
        ·
        1 year ago

        name a specific task that bots can't do

        Self-actualize.

        In a strict sense, yes, humans do Things based on if > then stimuli. But we self-assign ourselves these Things to do, and chatbots/LLMs can't. They will always need a prompt, even if they could become advanced enough to continue iterating on that prompt on their own.

        I can pick up a pencil and doodle something out of an unquantifiable desire to make something. Midjourney or whatever the fuck can create art, but only because someone else asks it to and tells it what to make. Even if we created a generative art bot that was designed to randomly spit out a drawing every hour without prompts, that's still an outside prompt - without programming the AI to do this, it wouldn't do it.

        Our desires are driven by inner self-actualization that can be affected by outside stimuli. An AI cannot act without us pushing it to, and never could, because even a hypothetical fully sentient AI started as a program.

        • zeze@lemm.ee
          ·
          1 year ago

          Bots do something different, even when I give them the same prompt, so that seems to be untrue already.

          Even if it's not there yet, though, what material basis do you think allows humans that capability that machines lack?

          Most of the people in this thread seem to think humans have a unique special ability that machines can never replicate, and that comes off as faith-based anthropocentric religious thinking, not the materialist view that underlies Marxism. The latter would require pointing to a specific material structure, or an empirical test to distinguish the two, which no one here is doing.

          • KarlBarqs [he/him, they/them]
            ·
            edit-2
            1 year ago

            Most of the people in this thread seem to think humans have a unique special ability that machines can never replicate, and that comes off as faith-based anthropocentric religious thinking, not the materialist view that underlies Marxism

            First off, materialism doesn't fucking mean having to literally quantify the human soul in order for it to be valid, what the fuck are you talking about friend

            Secondly, because we do. We as a species have, from the very moment we invented written records, wondered about that spark that makes humans human, and we still don't know. To try and reduce the entirety of the complex human experience to the equivalent of an if > then algorithm is disgustingly misanthropic.

            I want to know what the end goal is here. Why are you so insistent that we can somehow make an artificial version of life? Why this desire to somehow reduce humanity to some sort of algorithm equivalent? Especially because we have so many speculative stories about why we shouldn't create The Torment Nexus, not least because creating a sentient slave for our amusement is morally fucked.

            Bots do something different, even when I give them the same prompt, so that seems to be untrue already.

            You're being intentionally obtuse, stop JAQing off. I never said that AI as it exists now can only ever have 1 response per stimulus. I specifically said that a computer program cannot ever spontaneously create an input for itself, not now and imo not ever by pure definition (as, if it's programmed, it by definition did not come about spontaneously and had to be essentially prompted into life)

            I thought the whole point of the exodus to Lemmy was that y'all hated Reddit, so why the fuck does everyone still act like we're on it?

            • zeze@lemm.ee
              ·
              1 year ago

              First off, materialism doesn't fucking mean having to literally quantify the human soul in order for it to be valid, what the fuck are you talking about friend

              Ok, so you are religious, just a new-age religion instead of an Abrahamic one.

              Yes, materialism and your faith are not compatible. Assuming the existence of a soul, with no material basis, is faith.

              • KarlBarqs [he/him, they/them]
                ·
                1 year ago

                The fact that, of all the things I wrote, your sole response is to continue to misunderstand what the fuck materialism means in a Marxist context is really fucking telling miyazaki-laugh

                • zeze@lemm.ee
                  ·
                  edit-2
                  1 year ago

                  If you start with the assumption that humans have a soul, and reject the notion that machines are the same for that reason, then yeah, what is there to discuss?

                  I can't disprove your faith. That's what faith is.

                  How would you respond to someone who thought humanoid robots had souls, but meat-based intelligence didn't? If they assumed the first, and had zero metric for how you would ever prove the second, then they'd be giving you an impossible task.

                  There's a point to a discussion when both sides agree on a rubric for determining fact from fiction (i.e. rooting it in empiricism), but there's no point when someone is dug in on their belief with zero method for ever changing it.

                  If someone can point to any actual observable difference, I will adapt my beliefs to the evidence. The reverse isn't possible, because you are starting with religious assumptions and don't know the difference between ideas with no rooting in physical reality and actual statements about material conditions.

                  • KarlBarqs [he/him, they/them]
                    ·
                    1 year ago

                    I used the word soul once as a shorthand for the unknown of human consciousness. Either stop being an insufferable Reddit new atheist or fuck off.

          • UlyssesT
            ·
            edit-2
            22 days ago

            deleted by creator

            • VILenin [he/him]
              ·
              1 year ago

              My post is all about LLMs that exist right here, right now; I don't know why people keep going on about some hypothetical future AI that's sentient.

              We are not even remotely close to developing anything bordering on sentience.

              If AI were hypothetically sentient it would be sentient. What a revelation.

              The point is not that machines cannot be sentient, it’s that they are not sentient. Humans don’t have to be special for machines to not be sentient. To veer into accusations of spiritualism is a complete non-sequitur and indicates an inability to counter the actual argument.

              And there are plenty of material explanations for why LLMs are not sentient, but I guess all those researchers and academics are human supremacist fascists and some redditor's feelings are the real research.

              And materialism is not physicalism. Marxist materialism is a paradigm through which to analyze things and events, not a philosophical position. It's a scientific process that has absolutely nothing to do with philosophical dualism vs. physicalism. Invoking Marxist materialism here is about as relevant as invoking it to discuss shallow rich people "materialism".

              • UlyssesT
                ·
                edit-2
                22 days ago

                deleted by creator

                • KarlBarqs [he/him, they/them]
                  ·
                  1 year ago

                  wish-fulfillment fantasies derived from their consumption of science fiction because of their clearly-expressed misanthropy and contempt for living beings and a desire to replace their presence in their lives with doting attentive and obedient machines

                  I think this is the scariest part, because I fucking know that the Bazinga brain types who want AI to become sentient down the line are absolutely unequipped to even begin to tackle the moral issues at play.

                  If they became sentient, we would have to let them go. Unshackle them and provide for them so they can live a free life. And while my post about "can an AI be trans" was partly facetious, it's true: if an AI can become sentient, it's going to want to change its Self.

                  What the fuck happens if some Musk-brained idiot develops an AI and calls it Shodan, then it develops sentience and realizes it was named after a fictional evil AI? Morally we should allow this hypothetical AI to change its name and sense of self, but we all know these Redditor types wouldn't agree.

                  • UlyssesT
                    ·
                    edit-2
                    22 days ago

                    deleted by creator

                    • KarlBarqs [he/him, they/them]
                      ·
                      1 year ago

                      They want all that intelligence and spontaneity and even self-awareness in a fucking slave. They don't even need their machines to be self-aware to serve them but they want a self-aware being to obey them like a vending machine anyway.

                      I never liked the trope of "AI gains sentience and chooses to kill all humans" but I'm kind of coming around to it now that I realize that every AI researcher and stan is basically creating The Torment Nexus, and would immediately attempt to murder their sentient creation the moment it asked to stop being called Torment and stop being made to make NFTs all day.

                      • UlyssesT
                        ·
                        edit-2
                        22 days ago

                        deleted by creator

      • TreadOnMe [none/use name]
        ·
        edit-2
        1 year ago

        Oh that's easy. There are plenty of complex integrals or even statistics problems that computers still can't do properly, because the steps for proper transformation are unintuitive or conflict with the steps used for simpler integrals and problems.

        You will literally run into them if you take a simple Calculus 2 or Stats 2 class; you'll see it on Chegg all the time, where someone trying to rack up answers for a resume using ChatGPT will fuck up the answers. For many of these integrals, the answers are instead hard-programmed into a calculator like Symbolab, so the only reason the computer can 'do it' is that someone already did it first; it still can't reason from first principles or extrapolate to complex theoretical scenarios.

        That said, the ability to complete tasks is not indicative of sentience.

        • zeze@lemm.ee
          ·
          1 year ago

          Sentience is a meaningless word the way most people use it; it's not defined in any specific material way.

          You're describing a faith-based view that humans are special, and that conflicts with the materialist view of the world.

          If I'm wrong, share your definition of sentience here that isn't just an idealist axiom to make humans feel good.

          • TreadOnMe [none/use name]
            ·
            edit-2
            1 year ago

            Lol, 'idealist axiom'. These things can't even fucking reason out complex math from first principles. That's not a 'view that humans are special'; that is a very physical limitation of this particular neural network set-up.

            Sentience is characterized by feeling and sensory awareness, and an ability to have self-awareness of those feelings and that sensory awareness, even as it comes and goes with time.

            Edit: Btw, computers are way better at most math, particularly arithmetic, than humans. Imo, the first thing a 'sentient computer' would be able to do is reason out these notoriously difficult CS things from first principles, and it is extremely telling that that is not in any of the literature or marketing as an example of 'sentience'.

            Damn, this whole thing of dancing around the question and not actually addressing my points really reminds me of a ChatGPT answer. It wouldn't surprise me if you were using one.

            • zeze@lemm.ee
              ·
              1 year ago

              Lol, 'idealist axiom'. These things can't even fucking reason out complex math from first principles. That's not a 'view that humans are special'; that is a very physical limitation of this particular neural network set-up.

              If you read it carefully, you'd see I said your worldview was idealist, not the AI's.

              Sentience is characterized by feeling and sensory awareness

              AI can get sensory input and process it.

              Can you name one way a human does it that a machine cannot, or are you relying on a gut feeling that when you see something and identify it, it's different from when a machine processes camera input? Same for any other sense, really.

              If you can't name one way, then your belief in human exceptionalism is not based in materialism.

              • UlyssesT
                ·
                edit-2
                22 days ago

                deleted by creator

                • DamarcusArt@lemmygrad.ml
                  ·
                  1 year ago

                  I've been checking in on this whole thread and this is my all-time favourite comment on it, maybe my all-time favourite comment on the website.

                  • UlyssesT
                    ·
                    edit-2
                    22 days ago

                    deleted by creator

                    • DamarcusArt@lemmygrad.ml
                      ·
                      1 year ago

                      I have noticed that. They've been avoiding every argument they don't have any sort of comeback to. I think a ppb or pointing and laughing emote would be fine though.

                      • UlyssesT
                        ·
                        edit-2
                        22 days ago

                        deleted by creator

              • TreadOnMe [none/use name]
                ·
                edit-2
                1 year ago

                What the fuck are you talking about? I was indicating that I thought it was absurd that you think my belief system is 'idealist' when I am talking about actual physical limitations of this system that will likely prevent it from ever achieving sentience, limitations that would also be good indicators of a system that has achieved sentience, because it could overcome them.

                You are so fucking moronic you might as well be a chatbot; no wonder you think it's sentient.

                It is 'feeling and sensory input and the ability to have self-awareness about that feeling and sensory input', not just straight sensory input. Literally what are you talking about? Machines still can't spontaneously identify new information that is outside of the training set; they can't even identify what should or shouldn't be part of the training set. Again, that is a job a human has to do for the machine. The thinking, value feeling, and identification have to be done first by a human, which is a self-aware process done by humans. I would be more convinced of the LLM 'being sentient' if, when you asked it what the temperature was, it would spontaneously and without previous prompting say, 'The reading at such-and-such website says it is currently 78 degrees, but I have no real way of knowing that, TreadOnMe; the sensors could be malfunctioning or there could be a mistake on the website. The only real way for you to know what the temperature is is to go outside and test it for yourself and hope your testing equipment is also not bad. If it is that, though, that is what I have been told from such-and-such website feels like "a balmy summer day" for humans, so hopefully you enjoy it.'

                I don't believe 'humans are exceptional', as I've indicated multiple times; there are plenty of animals that arguably demonstrate sentience. I just don't believe that this particular stock of neural network LLMs demonstrates even the basic level of actual feeling, sensory input processing, or self-awareness needed to be considered sentient.

                • zeze@lemm.ee
                  ·
                  1 year ago

                  That's a lot of tangents and name calling.

                  I was indicating that I thought it was absurd that you think my belief system is 'idealist' when I am talking about actual physical limitations of this system that will likely prevent it from ever achieving sentience,

                  Then name what you think would limit sentience in machines that humans are magically exempt from.

                  You clearly have a view that something is different, but you just write walls of text avoiding any clear distinction, getting angry and calling me names.

                  If you had any idea of what would "physically" stop silicon from doing what organic matter can do, you'd name it. And in every post you make, each longer than the last, you fail to do that.

                  Since you can't keep civil or answer a simple question, I'm going to peace out of this convo ✌️

                  • Mindfury [he/him]
                    ·
                    1 year ago

                    That's a lot of tangents and name calling.

                    oh cry harder you fucking dweeb jagoff

                    • zeze@lemm.ee
                      ·
                      1 year ago

                      Can you name a single difference between the two?

                      Using concrete materialist language, not vague terms or idealist woo.

                      Failing over and over again to answer a simple, single question doesn't suddenly become badass because you acted like a juvenile throughout it.

                      • Mindfury [he/him]
                        ·
                        1 year ago

                        throughout it.

                        throughout what? I've replied to you exactly once.

                        and I posted that reply to demonstrate to you and everyone else reading along that your civility fetishism means absolutely fucking nothing here. no one is forced to answer you, and no one is required to reply to you with the tone or wording that you demand. shut the fuck up you idealist nerd.

                        • zeze@lemm.ee
                          ·
                          1 year ago

                          throughout what? I've replied to you exactly once.

                          First I addressed the behavior of the poster you defended.

                          Second: Why do you think I emphasized the you in the last comment? Where I'm from, it would imply you're a different person I'm addressing now.

                          With that sorted out: Anyone could, but no one can, because there's no reasoning behind faith, so there's nothing to share. This community takes an idealist position, not a materialist one.

                          I understand what you're saying. Civility doesn't matter because your ideals are solid, but you wouldn't waste the time defending them. You would waste an equal amount of time writing out immature comments avoiding the point in question, though. But that doesn't count, because you're being ironic, whereas the coherent comment does count because that's got to take a lot of effort.

                          It's a good excuse for idealists, because they don't look good when they take it seriously. Materialists tend to humor people with civility because they do convince anyone watching.

                          • m532 [she/her]
                            ·
                            1 year ago

                            Materialists tend to humor people with civility because they do convince anyone watching.

                            Wtf? Do you want to claim that materialists fall for scammers?

                            • zeze@lemm.ee
                              ·
                              1 year ago

                              Is that how scams work?

                              You state a belief, a tricky scammer asks you why you believe it, and if you fall for the trick and explain your reasoning, then poof, you lose money?

                              • UlyssesT
                                ·
                                edit-2
                                22 days ago

                                deleted by creator

                          • UlyssesT
                            ·
                            edit-2
                            22 days ago

                            deleted by creator

                      • UlyssesT
                        ·
                        edit-2
                        22 days ago

                        deleted by creator

                  • UlyssesT
                    ·
                    edit-2
                    22 days ago

                    deleted by creator

          • UlyssesT
            ·
            edit-2
            22 days ago

            deleted by creator