I must confess I have a personal vendetta against Yudkowsky and his cult. I studied computer science in college. As an undergrad, I worked as an AI research assistant. I develop software for a living. This is my garden the LessWrong crowd is trampling.

  • Nachorella@lemmy.sdf.org
    ·
    1 year ago

    I absolutely hate people throwing around the word 'sentient' like they know what it means. If you just read any dictionary, it's pretty obvious it's not some arbitrary word you can use to say it's ok to kill animals.

  • CascadeOfLight [he/him]
    ·
    1 year ago

    I think Eliezer Yudkowsky could probably ask in words not to be killed, but that doesn't mean he contains a spark of awareness like a chicken does.

    • BeamBrain [he/him]
      hexagon
      M
      ·
      1 year ago

      Do they think that people without the ability to communicate are not sapient?

      I can fully believe Yud and his followers are ableist as hell.

  • barrbaric [he/him]
    ·
    1 year ago

    Love when my "AI expert" has no education on the subject and just makes up random thought experiments with no basis in reality.

    And this dipshit is somehow a millionaire.

    • UlyssesT [he/him]
      ·
      1 year ago

      > And this dipshit is somehow a millionaire.

      He sucks up to billionaires so billionaires sprinkle money on him.

    • privatized_sun [none/use name]
      ·
      1 year ago

      > And this dipshit is somehow a millionaire.

      the most PMC neoliberal out of the entire class of IBM computer holocaust technicians :epsteinepstein

  • Yurt_Owl
    ·
    1 year ago

    if "kill" in input(): print("no kill pls")

    Zam my comment is sentient

  • Parsani [love/loves, comrade/them]
    ·
    edit-2
    1 year ago

    The hard problem of consciousness? What's that? Also, light switches definitely have feelings, and if you have a lot of light switches you've basically got a hive mind. Turning one light switch off is violence, and turning them all off is genocide.

    Edit: Seeing what comm this is, I'd like to make this clear. I will not kill the chicken, but I'd definitely enact genocide on light switches.

    :luddite-smash:

    • UlyssesT [he/him]
      ·
      1 year ago

      That was either said earnestly, or sarcastically, or in some smug, vague mixture of both, and across that entire spectrum there is only "autodidactic" euphoria. smuglord

  • WhyEssEff [she/her]
    ·
    edit-2
    1 year ago

    yud-rational the really big switch statement is more likely to have sentience than the living, breathing entities that experience the earth with us

    • UlyssesT [he/him]
      ·
      1 year ago

      "Bayesian" Silicon Valley cultists very truly have this "prior" at the base of all their "priors" from which they derive first-principle probabilities for all of their beliefs:

    • SpiderFarmer [he/him]
      ·
      1 year ago

      Seeing shit like Peterson and Shapiro, it really was better when right-wingers dismissed philosophy the same way they dismissed sociology and basic decency.

  • laziestflagellant [they/them]
    ·
    1 year ago

    He is so fucking stupid.

    Actually, you know what? I hope he keeps talking. I hope he keeps talking about a wider and wider variety of subjects, because that increases the odds of someone seeing him talk and going 'Wait a minute, I think this guy doesn't know what he's talking about!', which is the reaction everyone should have.

    • UlyssesT [he/him]
      ·
      1 year ago

      I would hope for that too, but unfortunately he's still running a profitable cult that regularly creeps on and drugs impressionable women to feed his slavery fetish.

      my-hero tier "you are a figment of my imagination and exist only to serve me" mantra applications and all. JB-shining-aggro

      • alcoholicorn [comrade/them, doe/deer]
        ·
        edit-2
        1 year ago

        I assumed Elon was just incapable of empathy, but after he tweeted "If you don't think there's a tiny chance you're an NPC, then you probably are", I wouldn't be shocked if his brain were a biological version of ChatGPT.

        Like their own sentience is the one thing every sentient being is certain of.

        • UlyssesT [he/him]
          ·
          1 year ago

          He's deeply high on his own occult supply, but I won't deny sentience to my-hero even if he bloviates in a meandering way intended to deny it to others.

  • GarbageShoot [he/him]
    ·
    1 year ago

    The normal refutation to this is that the LLM is not "telling" you anything, it is producing an output of characters that, according to its training on its data set, look like a plausible response to the given prompt. This is like using standard conditioning methods to teach a gorilla to make gestures corresponding to "please kill me now" in sign language and using that to justify killing it. They are not "communicating" with the symbols in the sense of the semantic meanings humans have assigned the symbols, because the context they have observed the symbols in and used them for is utterly divorced from those arbitrary meanings.
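
    To make that concrete, here's a minimal sketch, purely illustrative and nothing like a real LLM's internals (those predict tokens with a neural network, not a lookup table): a toy word-level Markov chain will happily emit "please do not kill me" whenever that is the statistically likely continuation of its training text, with no awareness anywhere in the loop.

      # A toy word-level Markov chain: it "responds" by emitting whichever
      # words tend to follow one another in its training text.
      # Nothing here models meaning, only word adjacency.
      import random
      from collections import defaultdict

      training_text = "please do not kill me . please do not hurt me ."

      # Record which word follows which in the training data.
      follows = defaultdict(list)
      words = training_text.split()
      for current, nxt in zip(words, words[1:]):
          follows[current].append(nxt)

      def respond(prompt_word, length=5):
          """Emit a continuation that merely looks plausible given the data."""
          out = [prompt_word]
          for _ in range(length):
              candidates = follows.get(out[-1])
              if not candidates:
                  break
              out.append(random.choice(candidates))
          return " ".join(out)

      print(respond("please"))  # e.g. "please do not kill me ."

    The output reads like a plea, but all the program "knows" is adjacency counts, just as the gorilla only knows which gestures earn rewards.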

    • UlyssesT [he/him]
      ·
      1 year ago

      Chinese Room thought experiment don't real. All that exists is Harry Potter fanfiction with sex slavery characteristics. Try being less wrong. smuglord

    • Mardoniush [she/her]
      ·
      edit-2
      1 year ago

      More formally, there was sentience involved, but it was upstream, when the data set was produced and curated in the first place. Which is why LLMs have that warmed-over photocopy feel.

  • UlyssesT [he/him]
    ·
    edit-2
    1 year ago

    Big Yud is a slavery-fetishizing woman-enslaving predatory monster.

    When he isn't dehumanizing human beings to rhetorically uplift his always-around-the-corner god-machines, he goes out of his way to place living beings in hierarchies of worthiness where he's at the very top and everything beneath him exists for his consumption, amusement, or both.

    Also, I utterly despise the idea that denigrating living beings somehow uplifts the treat-printing machines by implication. I see it everywhere, including on Hexbear sometimes, and I fucking hate it.

    This thread, for example, was full of "humans are just meat computers" reductionist bullshit, much of it now deleted, while the same posts elevated chatbots and LLMs into "at least as conscious, if consciousness exists at all" territory.

    https://hexbear.net/post/241191

    • UmbraVivi [he/him, she/her]
      ·
      1 year ago

      From what I've seen from Yudkowsky, what stands out to me is his ignorance and lack of curiosity about how anything actually works.

      If God is our way to explain questions we have no answer to, then you can truly find God in any machine as long as you maintain your own ignorance, as long as you choose not to lift the curtain. You can make claims that chickens have no sentience or that ChatGPT does, you can make claims that humans are meat computers as long as you don't actually care about the true inner workings of anything, as long as you don't interrogate what any of that actually means.

      Yudkowsky's beliefs truly sound like faith, pure religious faith. The less you know, the more you can believe.

      • UlyssesT [he/him]
        ·
        1 year ago

        > From what I've seen from Yudkowsky, what stands out to me is his ignorance and lack of curiosity about how anything actually works.

        He's a self-described "autodidact" who claims that actual academics, and for that matter experts in actual AI-related research fields, are inferior to his autodidactic genius because they don't take him seriously.

        > You can make claims that chickens have no sentience or that ChatGPT does, you can make claims that humans are meat computers as long as you don't actually care about the true inner workings of anything, as long as you don't interrogate what any of that actually means.

        Crude reductionism is an easy default take for someone too incurious and arrogant to even consider that something might be too complex for current understanding to fully grasp.

        > Yudkowsky's beliefs truly sound like faith, pure religious faith. The less you know, the more you can believe.

        He runs a cult, period. It has all the traits of one: a chosen elect, damned outsiders, prophecies of salvation and doom, and, of course, the sex-predator antics.

    • BeamBrain [he/him]
      hexagon
      M
      ·
      edit-2
      1 year ago

      > he goes out of his way to place living beings in hierarchies of worthiness where he's at the very top and everything beneath him exists for his consumption, amusement, or both.

      Holy shit, this just made something click for me. I finally have a succinct way to describe why Big Yud is so awful. The infantilism, the selfishness, the need for control, the ego, the overestimation of his own abilities. Even the desire to be immortal.

      He's a real life Porky Minch.

      • UlyssesT [he/him]
        ·
        1 year ago

        Considering how he consumed and interpreted fucking Harry Potter and then vomited out "The Methods of Rationality," which was a magical realm of his personal sex slavery fetishes, I can only imagine the horror that would be his interpretation of the Mother/Earthbound series. yea

        • BeamBrain [he/him]
          hexagon
          M
          ·
          edit-2
          1 year ago

          > I can only imagine the horror that would be his interpretation of the Mother/Earthbound series.

          "Porky is the good guy who introduced LE SCIENCE AND RATIONALITY to the primitive Tazmily, Lucas and co. stopped him from bringing about the singularity and saving the world"

          • UlyssesT [he/him]
            ·
            1 year ago

            You cracked the code. That's how deep it would go: "bad guy is good actually because immortality and superpowers." morshupls

      • Mardoniush [she/her]
        ·
        1 year ago

        Also, he's getting dumber. His ideas are getting worse with every iteration, and he was once a smart but annoying kid whose turnkey parents didn't keep him off Usenet before too much damage was done.

        • UlyssesT [he/him]
          ·
          1 year ago

          He's the shittiest kind of doomer these days, too: he's lamented that the world is doomed (not to climate change or environmental collapse, but to techbro "sexy robot goddess decides to kill the gross nerds who financed her construction" doomsday scenarios) and that he's to blame because there's not enough of him to solve the problem. what-the-hell