the-podcast guy recently linked this essay. it's old, but i don't think it's significantly wrong (despite gpt evangelists). also read weizenbaum, libs, for the other side of the coin

  • SerLava [he/him] · hexbear · 7 points · 1 month ago

    Mapping out the neurons in a single cubic millimeter of a human brain takes literal petabytes of storage, and that's just a static snapshot

    I read a long time ago that replicating all the functions of a human brain is probably possible with a computer around one order of magnitude less powerful than the brain, because it's kind of inefficient

    • bumpusoot [none/use name] · hexbear · 7 points · edited · 1 month ago

      There's no way we can know that currently. The brain does its work in all sorts of ways we really don't understand. Much as with the history of understanding DNA, what gets written off as "random inefficiency" is almost certainly a fundamental part of how it works.

    • dat_math [they/them] · hexbear · 4 points · 1 month ago

      because it's kind of inefficient

      relative to what and in what sense do you mean this?

      • SerLava [he/him] · hexbear · 2 points · 1 month ago

        I mean, for the most extreme example: it takes approximately 1 bazillion operations to solve 1+1

          • SerLava [he/him] · hexbear · 2 points · edited · 1 month ago

            No, I mean the human brain does that, and adding 1 and 1 can be done with like a few wires, or technically two rocks if you wanna be silly about it

            [image: circuit that adds 1 to a 4-bit number]
            This thing adds 1 to any number from 0 to 15, and it's tremendously less complex than a neuron; it's like 50 pieces of metal or whatever
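
            For anyone curious, the gate logic for that kind of circuit is small enough to write out directly. Here's a minimal sketch of a 4-bit incrementer in Python, one XOR and one AND per bit stage (the schematic in the image isn't reproduced here, so the exact structure is an assumption):

            ```python
            # A 4-bit "add 1" circuit expressed as plain gate logic.
            # Each bit position uses one XOR (sum bit) and one AND (carry bit).
            def increment_4bit(n: int) -> int:
                bits = [(n >> i) & 1 for i in range(4)]  # least-significant bit first
                carry = 1                                # the "+1" fed into bit 0
                out = []
                for b in bits:
                    out.append(b ^ carry)  # sum bit: XOR gate
                    carry = b & carry      # carry out: AND gate
                return sum(bit << i for i, bit in enumerate(out)) & 0b1111

            assert increment_4bit(0) == 1
            assert increment_4bit(7) == 8
            assert increment_4bit(15) == 0  # only 4 bits, so 15 wraps around to 0
            ```

            That's 8 gates in total; counted in transistors it lands in the same general ballpark as the "50 pieces of metal" above.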

              • SerLava [he/him] · hexbear · 2 points · 1 month ago

                The human brain does that many operations to add 1 and 1

                  • SerLava [he/him] · hexbear · 2 points · edited · 1 month ago

                    Don't you think imagining 1, imagining another 1, briefly remembering the concept of addition, thinking the word "plus", remembering that 1+1=2 is a true thing you know... that involves quite a few neurons firing, right? And each neuron is unimaginably more complex than a piece of digital hardware that adds 1 and 1, which again is like 40 or 50 pieces of metal and plastic.
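
                    For scale on the hardware side of that comparison, here's a rough transistor tally for a 1-bit full adder, using typical static-CMOS textbook gate sizes (implementation-dependent, so treat the numbers as assumptions):

                    ```python
                    # Textbook static-CMOS counts, not measurements of a real chip.
                    XOR_T = 12  # 2-input XOR gate
                    AND_T = 6   # 2-input AND gate (NAND + inverter)
                    OR_T = 6    # 2-input OR gate (NOR + inverter)

                    # Full adder: sum = a ^ b ^ cin; carry = (a & b) | (cin & (a ^ b))
                    full_adder_transistors = 2 * XOR_T + 2 * AND_T + OR_T
                    print(full_adder_transistors)  # 42 -- in the "40 or 50 pieces" range
                    ```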

    • Formerlyfarman [none/use name] · hexbear · 2 points · 1 month ago

      You can probably do it with a Pentium processor if you know how. The brain is very slow, and Pentium processors are amazingly fast. It's just that we have no idea how.

    • FunkyStuff [he/him] · hexbear · 1 point · 1 month ago

      Resident dumb guy chipping in, but are these two facts mutually exclusive? Assuming both are true, it just means you'd need a computer 1e12x as powerful as our supercomputers to simulate the brain, since the brain itself is something like 1e13x as powerful as a supercomputer. So we're still not getting there anytime soon.

      *With a very loose meaning of "powerful", seeing as the way the brain works is completely different from that of a computer calculating in discrete steps.
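
      Taking both of those figures at face value (they're numbers from this thread, not benchmarks), the arithmetic is just:

      ```python
      # Both figures are assumptions quoted from the thread above.
      brain_vs_supercomputer = 1e13  # "the brain is ~1e13x as powerful as a supercomputer"
      inefficiency_factor = 10       # "one order of magnitude less powerful than the brain"

      needed = brain_vs_supercomputer / inefficiency_factor
      print(f"{needed:.0e}x a supercomputer")  # 1e+12x a supercomputer
      ```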
