Literally just mainlining marketing material straight into whatever’s left of their rotting brains.

  • CannotSleep420@lemmygrad.ml · 7 months ago

    One doesn't need to assert the existence of an immaterial soul to point out that the mechanisms that lead to consciousness are different enough from the mechanisms that make computers work that the former can't just be reduced to an ultra-complex form of the latter.

    • oktherebuddy · 7 months ago

      There isn't a materialist theory of consciousness that doesn't look something like an ultra-complex computer. We're talking like an alternative explanation exists, but it really does not.

      • CannotSleep420@lemmygrad.ml · 7 months ago

        In what way does consciousness resemble an ultra-complex computer? Nobody has consciousness fully figured out, of course, but I would at least expect there to be some relevant parallel between computer hardware and brain hardware if that were the case.

        • oktherebuddy · 7 months ago

          When people say computer here, they mean computation as computer scientists conceive of it: abstract mathematical operations that can be modeled by Boolean circuits or Turing machines and embodied in physical processes. Computers in the sense you're talking about (computer hardware) are one method of embodying these operations.
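
An illustrative aside (not part of the thread): the point about computation being independent of its embodiment can be shown with a toy example. Both functions below compute the same abstract operation, XOR, one as a composition of logic gates and one as a stored truth table; the function names and the choice of XOR are arbitrary.

```python
# Toy illustration: the same abstract operation (XOR) realized by two
# different "embodiments". The operation is defined by its input/output
# behaviour, not by the mechanism that produces it.

def xor_from_gates(a: bool, b: bool) -> bool:
    """XOR composed from AND, OR and NOT gates."""
    return (a or b) and not (a and b)

def xor_from_table(a: bool, b: bool) -> bool:
    """The same operation looked up in a stored truth table."""
    table = {
        (False, False): False,
        (False, True): True,
        (True, False): True,
        (True, True): False,
    }
    return table[(a, b)]

# Both realizations agree on every input.
assert all(
    xor_from_gates(a, b) == xor_from_table(a, b)
    for a in (False, True)
    for b in (False, True)
)
```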

          • CannotSleep420@lemmygrad.ml · 7 months ago

            I probably should have worded my last reply differently, because modeling the human brain with Boolean circuits and Turing machines is mainly what I have an issue with. While I'm not particularly knowledgeable on the brain side of things, I can see the resemblance between neurons and logic gates. However, my contention is that the material constraints of how those processes are embodied are going to have a significant effect on how the system works (not to say that you were erasing this effect entirely).

            I want to say more on the topic, but now that my mind is on it I want to put some time and effort into explaining my thoughts in a post of their own. I'll @ you in a reply if/when I make the post.

            • Saeculum [he/him, comrade/them] · 7 months ago

              However, my contention is that the material constraints of how those processes are embodied are going to have a significant effect on how the system works

              Sure, but that's no basis to think that a group of logic gates could not eventually be made to emulate a neuron. The neuron has a finite number of things it can do because of the same material constraints, and while one would probably end up larger than the other, increasing the physical distances between the thinking parts, that would surely only limit the speed of an emulated thought rather than its substance?
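
As an illustrative aside (not part of the thread): a minimal sketch of what "emulating a neuron" means in the computational sense being discussed might look like the leaky integrate-and-fire model below. The model, parameter values, and function name are assumptions invented for the example, and real neurons are vastly richer than this.

```python
# Toy leaky integrate-and-fire neuron: an assumption-laden sketch, not a
# claim about how biological neurons (or brains) actually work.

def simulate_lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Integrate input current over time with a leak; emit a spike (True)
    whenever the membrane potential crosses the threshold, then reset."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leaky integration step
        if potential >= threshold:
            spikes.append(True)   # the neuron "fires"
            potential = 0.0       # reset after the spike
        else:
            spikes.append(False)
    return spikes

# A steady sub-threshold input accumulates until the neuron fires periodically.
print(simulate_lif_neuron([0.3] * 10))
# [False, False, False, True, False, False, False, True, False, False]
```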

          • silent_water [she/her] · 7 months ago

            it still remains to be proved that consciousness can be emulated on a Turing machine. that's a huge open problem. you can assume it's true but your results are contingent.

        • drhead [he/him] · 7 months ago

          What stops me from doing the same thing that neurons do with a sufficiently large hunk of silicon, assuming that some amount of abstraction is fine?

          If the answer is "nothing", then that demonstrates the point. If you can build an artificial brain that does all of the things a brain does, then there is nothing special about our brains.

          • Egon [they/them] · 7 months ago

            But can you actually build an artificial brain with a hunk of silicon? We don't know enough about brains or consciousness to do that, so the point is kinda moot.

      • WideningGyro [any] · 7 months ago

        I zoned out on the consciousness debate around 2015, so forgive me if this stuff is now considered outdated, but as I recall, those materialist theories of consciousness all run into the hard problem, right? I might be biased in one direction, but I feel like the fact that computational models can't account for lived experience is a pretty good argument against them. Wouldn't it be more accurate to say that we're missing a good theory of consciousness altogether?