I don’t know how there aren’t a myriad of problems associated with attempting to emulate the brain, especially with the end goal of destroying livelihoods and replacing one indentured servant with another. In fact, that’s what prompted this post: an advertisement for a talk with my alma mater’s philosophy department asking what happens when we see LLMs discover phenomenological awareness.

I admit that I don’t have a ton of formal experience with philosophy, but I took one course in college that will forever be etched into my brain. Essentially, my professor explained to us the concept of a neural network and how, with more computing power, researchers hope to emulate the brain and establish a consciousness baseline against which to compare a human’s subjective experience.

This didn’t use to be the case, but in a particular sector, most people’s jobs are just showing up at work, getting on a computer, and having whatever (completely unregulated and resource-devouring) LLM give them answers they could find themselves, just quicker. And shit like Neuralink exists, and I think the next step will be to offer that with a ChatGPT integration or some dystopian shit.

Call me crazy, but I don’t think humans are as special as we think we are, and our pure arrogance wouldn’t stop us from creating another self and causing that self to suffer. Hell, we collectively decided to slaughter en masse another group of beings with feelings (animals) to appease our tastebuds, a lot of us are thoroughly entrenched in our digital boxes because opting out would mean losing things we take for granted, and any discussion of these topics is taboo.

Data-obsessed weirdos are a genuine threat to humanity; consciousness emulation never should have been a conversation piece in the first place without first understanding its downstream implications. Feeling like a certified Luddite these days.

  • BoxedFenders [any, comrade/them] · 1 month ago

    Human consciousness is an emergent property of neurons firing in our brain. Unless you attribute consciousness to some external mystical force, replicating it should theoretically be possible. I'm not saying LLMs are the path to get there or that we are anywhere close to it, but it seems inevitable that it will eventually be achieved.

    • GaveUp [she/her] · 1 month ago

      All the math done to estimate the computation required shows absurd numbers at minimum.

      Capitalists will never try to truly emulate a human brain because it's infinitely cheaper to just hire/breed/enslave real ones to do whatever you need.

      • Saeculum [he/him, comrade/them] · 1 month ago

        All the math done to estimate the computation required shows absurd numbers at minimum.

        Nature fit it into a space the size of a human head with a bunch of redundancy through an unconscious process of trial and error.

    • imogen_underscore [it/its, she/her] · edit-2 · 1 month ago

      I personally do believe in the human soul and don't think rationalist vulgar materialism can fully explain consciousness, so yeah, I guess we may just fundamentally disagree there. It doesn't even have to be something "mystical" though; it could just be something totally unknown to science that can never be replicated in silicon. Even if you still think it's possible, it's plain that the current extinction event and the technological setbacks/energy crises it will bring are going to prevent much progress being made towards the currently science fiction-level technology and energy required to get even close. Far from "inevitable" in my view, and ultimately a total waste of time and resources. You may as well say Dyson spheres, another thing made up by SF writers, are inevitable; energy crises, tech setbacks, and population destruction will always get in the way. It's utopian to a cartoonish extent; hundreds or thousands of years of end-stage communism would be needed for this kind of stuff to even begin being feasible, and if we had that, I would hope creating AI slaves wouldn't be very high on the agenda. That's why I think taking it seriously is a waste of time.

      • BoxedFenders [any, comrade/them] · 1 month ago

        Even if you still think it's possible, it's plain that the current extinction event and the technological setbacks/energy crises it will bring are going to prevent much progress being made towards the currently science fiction-level technology and energy required to get even close.

        No disagreement from me on this point. And by no means did I mean it will inevitably happen in our lifetime or even centuries from now. Just that it is theoretically possible and there is no physical limitation that forbids it, unlike, say, faster-than-light space travel. But since you believe in human souls, I'm curious: would you ever concede that a sufficiently advanced machine could be conscious, or would you dismiss it as trickery of code? Does consciousness only arise when a soul is assigned to an organism with 46 chromosomes?

        • imogen_underscore [it/its, she/her] · edit-2 · 1 month ago

          I'm not really interested in any kind of debate about this, sorry. You're not being rude or anything; I just find the idea tedious. As I said, it's clear we just disagree on a basic thing here, and I'm fine with keeping it that way.

      • Saeculum [he/him, comrade/them] · 1 month ago

        could just be something totally unknown to science

        Any thoughts on brain organoid computers related to this?