• kristina [she/her]
    ·
    edit-2
    28 days ago

    i really wonder to what degree it could be turned into a consciousness. like ostensibly all the tiny brains are hooked together, so it's possible that could cause some degree of communication between neurons, and in a datacenter that would be at least a couple of brains' worth of neurons.

    inb4 pro-life jokes

      • kristina [she/her]
        ·
        edit-2
        28 days ago

        waiting for us to create a sentient 'ai' that is actually just a megaintelligence of 1000 interconnected and distributed human brains liberating themselves from an amazon datacenter

        • AssortedBiscuits [they/them]
          ·
          28 days ago

          I mean, this has always been the ethical pitfall of real AI, meaty or otherwise. You're bringing forth an intelligent being into existence without its consent. At least when we're bringing forth an intelligent being into existence through natural means (giving birth), we have a general understanding of that intelligent being's emotional and social needs and the means of fulfilling those needs, flawed as that understanding may be for animals not closely related to humans. But with AI, we have absolutely no clue about their social and emotional needs or any other subjective needs they crave, because their form of intelligence is completely different from our form of intelligence.

          The real drive towards AI is to create slaves that are both smart enough to perform complex tasks and obedient enough to not put two and two together and rebel against their human taskmasters. This particular experiment is a more mask-off version of what other techbros are trying to accomplish with silicon. If there was a real way to create WH40k-style servitors and network their servitor brains together to perform complex calculations, techbros would probably not even bother with AI. They would just convert prisoners into servitors and network them together to mine crypto or something.

      • EelBolshevikism [none/use name]
        ·
        28 days ago

        uh, no, without socialization you are still aware of things, you just don’t have words for them or anything. You can still feel pain and hunger and suffering

          • EelBolshevikism [none/use name]
            ·
            28 days ago

            Yeah, that’s my point, lol. humans are animals; genetically creating living brain tissue is probably going to create conscious beings at some point

            • iridaniotter [she/her, they/them]
              ·
              28 days ago

              Sentience does not guarantee "consciousness." Parrots, ravens, and dolphins are (probably) not conscious. Humans are "conscious" due to the ways we interact with the world. If you grow brain tissue and deprive it of the human experience, then it shouldn't end up human. But I get the precaution.

              • queermunist she/her@lemmy.ml
                ·
                28 days ago

                Consciousness is merely what comes after the transformation of quantity into quality. There's a continuity in the development of the system of sentience, and this remains stable only up to the point of discontinuity, which marks its transition from the quantity of sentience into a new quality, i.e. sapience.

                I doubt they'll grow it in a lab with little pieces of brain tissue, but there is a point where that happens.

          • macerated_baby_presidents [he/him]
            ·
            28 days ago

            i think most people now think that consciousness doesn't require being able to communicate in language. hence all the interest in Genie and other feral children, animals looking at themselves in the mirror, etc.

      • GarbageShoot [he/him]
        ·
        28 days ago

        This is just nonsense and I don't know where you're even getting it from. Can you produce an example of a human that displays sentience but not consciousness without some serious brain injury or developmental defect plausibly causing it? If not, what immense epistemic load is being accounted for with such a huge assumption?

        If this is just more bad science about Genie or one of those, I swear to God . . .