the-podcast guy recently linked this essay. it's old, but i don't think it's significantly wrong (despite gpt evangelists). also read weizenbaum, libs, for the other side of the coin

  • SerLava [he/him]
    ·
    6 months ago

    Mapping out the neurons in a single cubic millimeter of a human brain takes literal petabytes of storage, and that's just a static snapshot

    I read a long time ago that replicating all the functions of a human brain is probably possible with a computer around one order of magnitude less powerful than the brain, because the brain is kind of inefficient

    • bumpusoot [any]
      ·
      edit-2
      6 months ago

      There's no way we can know that, currently. The brain does work in all sorts of ways we really don't understand. Much like the history of understanding DNA, what gets written off as "random inefficiency" is almost certainly a fundamental part of how it works.

    • dat_math [they/them]
      ·
      6 months ago

      because it's kind of inefficient

      relative to what and in what sense do you mean this?

      • SerLava [he/him]
        ·
        6 months ago

        I mean for the most extreme example, it takes approximately 1 bazillion operations to solve 1+1

          • SerLava [he/him]
            ·
            edit-2
            6 months ago

            No I mean the human brain does that, and adding 1 and 1 can be done with like a few wires, or technically two rocks if you wanna be silly about it

            This thing adds 1 to any number from 0 to 15, and it's tremendously less complex than a neuron; it's like 50 pieces of metal or whatever
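
            For anyone who wants to see that concretely, here's roughly the same idea sketched in Python instead of metal: a 4-bit incrementer is just four half adders (one XOR and one AND gate each) chained by their carries. The gate-by-gate breakdown below is illustrative, not a claim about any particular circuit.

            ```python
            # Gate-level sketch of a 4-bit incrementer: adds 1 to any number from 0 to 15.
            # Each half adder is one XOR gate (sum bit) and one AND gate (carry bit).
            def half_adder(a, b):
                return a ^ b, a & b  # (sum, carry)

            def increment_4bit(bits):
                """bits: four 0/1 ints, least significant bit first."""
                carry = 1  # the "+1" enters as the initial carry
                out = []
                for bit in bits:
                    s, carry = half_adder(bit, carry)
                    out.append(s)
                return out  # the final carry is dropped, so 15 + 1 wraps to 0

            # Example: 0b0101 (5) + 1 -> 0b0110 (6), listed LSB first
            print(increment_4bit([1, 0, 1, 0]))  # [0, 1, 1, 0]
            ```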

              • SerLava [he/him]
                ·
                6 months ago

                The human brain does that many operations to add 1 and 1

                  • SerLava [he/him]
                    ·
                    edit-2
                    6 months ago

                    Don't you think imagining 1, imagining another 1, briefly remembering the concept of addition, thinking the word "plus", remembering that 1+1=2 is a true thing you know... that involves quite a few neurons firing right? And each neuron is unimaginably more complex than a piece of digital hardware that adds 1 and 1, which again is like 40 or 50 pieces of metal and plastic

                    • dat_math [they/them]
                      ·
                      edit-2
                      6 months ago

                      that involves quite a few neurons firing right

                      I think it's far fewer than you think. Here's my logic:

                      First, let's make this a fair comparison. If we're comparing neural computation of 1+1 to a 4-bit adder, then our signals are already downstream of the networks that do the initial perceptual processing and convert it into neural correlates of the number 1 and the operation of addition. If they aren't, then we'd also have to account for the power consumed getting data into the 4-bit adder, which generally makes the problem much more difficult.

                      My quick hypothesis is that for people who have practiced the arithmetic enough to not need to work it out manually, the recall is accomplished with fewer than 10,000 neurons spiking (and orders of magnitude more neurons not spiking, which consumes a negligible amount of energy). Even if this many neurons are involved, they're not all spiking constantly, or the network would be having a seizure. Typically only a small fraction of the neurons involved in a task are spiking at a high rate simultaneously, but even if all 10,000 were spiking constantly for the duration of the calculation, a typical human brain consumes about 20 watts and has 10^11 neurons total, so our computation requires only about 2 microwatts (20 W / 10^11 neurons ≈ 2×10^-10 W per neuron, times 10^4 neurons).
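
                      If you want to sanity-check that figure, the back-of-envelope arithmetic is short enough to write out; the 10,000-neuron count is the guess above and everything else is the same assumed numbers:

                      ```python
                      # Rough arithmetic behind the ~2 microwatt estimate.
                      # All numbers are assumptions from the comment above, not measurements.
                      brain_power_w = 20.0      # typical whole-brain power draw, watts
                      n_neurons = 1e11          # total neurons in a human brain
                      active_neurons = 1e4      # generous guess at neurons recalling 1+1

                      w_per_neuron = brain_power_w / n_neurons   # ~2e-10 W per neuron
                      print(w_per_neuron * active_neurons)       # ~2e-06 W, about 2 microwatts
                      ```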

                      Now, if we were to make the comparison truly level, we'd discard all the memory circuitry and we'd compare the power consumption of the 4-bit adder to that of a reasonably sized biological neural network that has learned to accomplish only the task of 4-bit addition.

                      I'm trying not to get nerd sniped here, but if you're curious and handy with Python, you could put together an experiment to see how many leaky integrate-and-fire (LIF) neurons you need to do 4-bit addition with this library.
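
                      Not assuming any particular library here, but the basic building block is tiny: a bare-bones leaky integrate-and-fire neuron in plain Python looks something like the sketch below (parameter values are purely illustrative). Wiring several of these into a 4-bit adder is the actual experiment.

                      ```python
                      # Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
                      # leaks toward rest, integrates input current, and emits a spike (then
                      # resets) whenever it crosses threshold. Parameters are illustrative.
                      def simulate_lif(input_current, dt=1e-3, tau=20e-3,
                                       v_rest=0.0, v_reset=0.0, v_thresh=1.0):
                          v, spikes = v_rest, []
                          for i_t in input_current:
                              # leaky integration: dv/dt = (-(v - v_rest) + i_t) / tau
                              v += dt * (-(v - v_rest) + i_t) / tau
                              if v >= v_thresh:
                                  spikes.append(1)
                                  v = v_reset  # fire and reset
                              else:
                                  spikes.append(0)
                          return spikes

                      # Constant drive above threshold spikes regularly; weaker drive never fires
                      print(sum(simulate_lif([1.5] * 100)), sum(simulate_lif([0.5] * 100)))
                      ```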

                      This group found a way to do 16-bit addition with a 4-neuron spiking neural network in silico. If four LIF neurons really are all you need for 16-bit addition, then the calculation can probably be accomplished with only about 8×10^-10 watts (four neurons at ~2×10^-10 W each), compared to an 8-gate circuit that, while switching, likely consumes on the order of 8 nanowatts, more than 10x our estimate for the biological network.
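
                      Putting the two estimates side by side, with the same assumed numbers as before and a guessed ~1 nW per switching gate:

                      ```python
                      # Same back-of-envelope assumptions as above; nothing here is measured.
                      brain_power_w = 20.0
                      n_neurons = 1e11
                      w_per_neuron = brain_power_w / n_neurons   # ~2e-10 W

                      snn_adder_w = 4 * w_per_neuron    # 4 LIF neurons doing 16-bit addition
                      gate_adder_w = 8 * 1e-9           # 8 gates at an assumed ~1 nW each

                      print(f"{snn_adder_w:.0e} W (spiking) vs {gate_adder_w:.0e} W (gates), "
                            f"~{gate_adder_w / snn_adder_w:.0f}x apart")
                      ```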

                      and that's why I hypothesize that computational efficiency, i.e. bits computed per watt expended, is higher for biological neural networks doing addition than for silicon doing the same thing. Thanks for coming to my a-guy talk

                      • SerLava [he/him]
                        ·
                        edit-2
                        6 months ago

                        Oh, I had no idea we were talking about electrical energy efficiency; I meant complexity. I was saying you could make a computer that's less computationally powerful and have it simulate the input and output of neurons without having as many parts or doing as many signals/operations as each neuron does

                        • dat_math [they/them]
                          ·
                          6 months ago

                          oh I agree re: the differences in complexity

                          I obviously thought we were talking about energy efficiency lol

    • Formerlyfarman [none/use name]
      ·
      6 months ago

      You can probably do it with a Pentium processor if you know how. The brain is very slow, and Pentium processors are amazingly fast. It's just that we have no idea how.

    • FunkyStuff [he/him]
      ·
      6 months ago

      Resident dumb guy chipping in, but are these two facts mutually exclusive? Assuming both are true, it just means you'd need a computer 1e12x as powerful as our supercomputers to simulate the brain, since the brain itself is 1e13x as powerful as a supercomputer. So we're still not getting there anytime soon.

      *With a very loose meaning of "powerful", seeing as the way the brain works is completely different from a computer that calculates in discrete steps.