the-podcast guy recently linked this essay. It's old, but I don't think it's significantly wrong (despite the GPT evangelists). Also read Weizenbaum, libs, for the other side of the coin

  • DefinitelyNotAPhone [he/him]
    ·
    6 months ago

    Meh, this is basically just someone being Big Mad about the popular choice of metaphor for neurology. Like, yes, the human brain doesn't have RAM or store bits in an array to represent numbers, but one could describe short term memory with that metaphor and be largely correct.

    Biological cognition is poorly understood primarily because the medium it runs on is incomprehensibly complex. Mapping out the neurons in a single cubic millimeter of a human brain takes literal petabytes of storage, and that's just a static snapshot. But ultimately it is something that occurs in the material world under the same rules as everything else, and it does not have some metaphysical component that somehow makes it impossible to simulate in software, much the same way we'd model a star's life cycle or galaxy formation; it's just unimaginable with current technology.

    • Formerlyfarman [none/use name]
      ·
      edit-2
      6 months ago

      The OP isn't arguing that it has a metaphysical component. It's arguing that the structure of the brain is different from the structure of your PC. The metaphor bit is important because all thinking is metaphor, at different levels of rigor and abstraction. A faulty metaphor forces you to think the wrong way.

      I do disagree with some things, though: what's a metaphor if not a model? What's reacting to stimuli if not processing information?

      • Frank [he/him, he/him]
        ·
        6 months ago

        The OP isn't arguing that it has a metaphysical component.

        Yes they are. They might scream in your face that they're not, but the argument they're making is based not on science and observation but on the chains of a Christian culture they do not feel and cannot see.

        A faulty metaphor forces you to think the wrong way.

        The Sapir-Whorf hypothesis, if it's accurate at all, does not have a strong effect.

        what's a metaphor if not a model?

        To quote the dictionary: "a figure of speech in which a word or phrase is applied to an object or action to which it is not literally applicable." Which seems to be the real problem here: psychologists and philosophers hear someone using a metaphor and assume they must literally believe what the psychologist or philosopher believes about the symbol being used.

        • Formerlyfarman [none/use name]
          ·
          6 months ago

          I think you are right. Our disagreement comes from whether the metaphor refers to structure or just to language. Take an atomic model where the electrons fly around a nucleus forming shells: it is also not literally applicable, but we think of it as a useful metaphor because it's close enough.

          The same should apply to the most sophisticated mathematical models. A useful metaphor should then be a more primitive form of the process, one that illustrates a mechanism. If the mechanism is different from the mechanism in the metaphor, then the metaphor is wrong.

          If the metaphor is just there to provide names, then you are of course right that it should not change anything.

          Whether the metaphor of computers and brains is correct or not should also have no effect on whether we can simulate a brain in a computer. Computers can, after all, simulate many things that do not work like computers.

    • SerLava [he/him]
      ·
      6 months ago

      Mapping out the neurons in a single cubic millimeter of a human brain takes literal petabytes of storage, and that's just a static snapshot

      I read long ago that replicating all the functions of a human brain is probably possible with a computer around one order of magnitude less powerful than the brain, because it's kind of inefficient

      • bumpusoot [any]
        ·
        edit-2
        6 months ago

        There's no way we can know that, currently. The brain does work in all sorts of ways we really don't understand. Much like the history of understanding DNA, what gets written off as "random inefficiency" is almost certainly a fundamental part of how it works.

      • dat_math [they/them]
        ·
        6 months ago

        because it's kind of inefficient

        relative to what and in what sense do you mean this?

        • SerLava [he/him]
          ·
          6 months ago

          I mean for the most extreme example, it takes approximately 1 bazillion operations to solve 1+1

            • SerLava [he/him]
              ·
              edit-2
              6 months ago

              No I mean the human brain does that, and adding 1 and 1 can be done with like a few wires, or technically two rocks if you wanna be silly about it

              This thing adds 1 to any number from 0 to 15 and it's tremendously less complex than a neuron, it's like 50 pieces of metal or whatever
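
              A rough software sketch of that kind of circuit, for the curious: a 4-bit incrementer built by rippling a carry through four half adders, each half adder being one XOR gate and one AND gate. This is an illustrative from-scratch sketch, not any particular piece of hardware:

```python
def half_adder(a, b):
    """One half adder: an XOR gate for the sum bit, an AND gate for the carry."""
    return a ^ b, a & b

def increment_4bit(n):
    """Add 1 to a 4-bit number by rippling a carry through four half adders."""
    bits = [(n >> i) & 1 for i in range(4)]  # little-endian bits of n
    carry = 1                                # the "+1" input
    out = []
    for bit in bits:
        s, carry = half_adder(bit, carry)
        out.append(s)
    # The final carry is discarded, so the result wraps at 16 like real 4-bit hardware.
    return sum(b << i for i, b in enumerate(out))

print(increment_4bit(7))   # 8
print(increment_4bit(15))  # 0 (wraps around)
```

              Eight gates total, which is about as far from the complexity of a single neuron as the comment suggests.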

                • SerLava [he/him]
                  ·
                  6 months ago

                  The human brain does that many operations to add 1 and 1

                    • SerLava [he/him]
                      ·
                      edit-2
                      6 months ago

                      Don't you think imagining 1, imagining another 1, briefly recalling the concept of addition, thinking the word "plus", remembering that 1+1=2 is a true thing you know... that involves quite a few neurons firing, right? And each neuron is unimaginably more complex than a piece of digital hardware that adds 1 and 1, which again is like 40 or 50 pieces of metal and plastic

                      • dat_math [they/them]
                        ·
                        edit-2
                        6 months ago

                        that involves quite a few neurons firing right

                        I think it's far fewer than you think. Here's my logic:

                        First, let's make this a fair comparison. If we're comparing the neural computation of 1+1 to a 4-bit adder, then our signals are already downstream of the networks that do the initial perceptual processing and convert it into neural correlates of the number 1 and the operation of addition. If they aren't, then we'd also need to somehow account for the power consumed getting data into the 4-bit adder, which makes the problem much more difficult.

                        My quick hypothesis is that for people who have practiced the arithmetic enough not to need to work it out manually, the recall is accomplished with fewer than 10,000 neurons spiking (and orders of magnitude more neurons not spiking, which consumes a negligible amount of energy). Even if that many neurons are involved, they're not all constantly spiking, or the network is having a seizure; typically only a small fraction of the neurons involved in a task spike at a high rate simultaneously. But even if all 10,000 spike constantly for the duration of the calculation, a typical human brain consumes about 20 watts and has 10^11 neurons total, so our computation requires only about 2 microwatts.
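
                        A back-of-envelope check of that arithmetic, using only the figures above:

```python
# Per-neuron power from whole-brain figures, then scaled to the task.
brain_power_watts = 20.0   # typical whole-brain power consumption
total_neurons = 1e11       # rough neuron count for a human brain
active_neurons = 1e4       # hypothesized upper bound spiking for the recall

watts_per_neuron = brain_power_watts / total_neurons  # 2e-10 W
task_power = watts_per_neuron * active_neurons        # 2e-06 W
print(f"{task_power:.0e} W")  # 2e-06 W, i.e. about 2 microwatts
```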

                        Now, if we were to make the comparison truly level, we'd discard all the memory circuitry and we'd compare the power consumption of the 4 bit adder to a reasonably sized biological neural network that has learned to accomplish only the task of 4 bit addition.

                        I'm trying not to get nerd sniped here, but if you're curious and handy with Python, you could put together an experiment to see how many leaky integrate-and-fire neurons you need to do 4-bit addition with this library
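
                        Not the library in question, but as a flavor of what such an experiment would build on, here is a minimal from-scratch leaky integrate-and-fire neuron with purely illustrative parameters (not fitted to biology):

```python
def lif_spike_times(input_current, dt=1e-4, tau=0.02,
                    v_rest=0.0, v_thresh=1.0, v_reset=0.0, r=1.0):
    """Euler-integrate the LIF equation tau * dv/dt = -(v - v_rest) + r*I,
    emitting a spike and resetting whenever v crosses threshold."""
    v = v_rest
    spike_steps = []
    for step, i_in in enumerate(input_current):
        v += dt * (-(v - v_rest) + r * i_in) / tau
        if v >= v_thresh:
            spike_steps.append(step)
            v = v_reset
    return spike_steps

# A constant suprathreshold drive produces a regular spike train.
current = [1.5] * 2000          # 200 ms of constant input at dt = 0.1 ms
spikes = lif_spike_times(current)
print(len(spikes))              # a handful of regularly spaced spikes
```

                        Wiring a few of these into a network that does 4-bit addition, and counting how many you need, is exactly the kind of experiment described above.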

                        This group found a way to do 16-bit addition with a 4-neuron spiking neural network in silico. If four LIF neurons are indeed all you need for 16-bit addition, then the calculation can probably be accomplished with only 8e-10 watts (compared to an 8-gate circuit that, when switching, likely consumes on the order of 8 nanowatts, more than 10x our estimate for the biological neural network)

                        And that's why I hypothesize that computational efficiency, i.e. bits computed per watt expended, is higher for biological neural networks doing addition than for silicon doing the same thing. Thanks for coming to my a-guy talk

                        • SerLava [he/him]
                          ·
                          edit-2
                          6 months ago

                          Oh I had no idea we were talking about electrical energy efficiency, I meant complexity. I was saying they could make a computer less computationally powerful and have it simulate the input and output of neurons without having as many parts or as many signals/operations as each neuron

                          • dat_math [they/them]
                            ·
                            6 months ago

                            oh I agree re: the differences in complexity

                            I obviously thought we were talking about energy efficiency lol

      • Formerlyfarman [none/use name]
        ·
        6 months ago

        You can probably do it with a Pentium processor if you know how. The brain is very slow, and Pentium processors are amazingly fast. It's just that we have no idea how.

      • FunkyStuff [he/him]
        ·
        6 months ago

        Resident dumb guy chipping in, but are these two facts mutually exclusive? Assuming both are true, it just means you'd need a computer 1e12x as powerful* as our supercomputers to simulate the brain, which is itself 1e13x as powerful as a supercomputer. So we're still not getting there anytime soon.

        *With a very loose meaning of "powerful", seeing as the way the brain works is completely different from a computer that calculates in discrete steps.

    • plinky [he/him]
      hexagon
      ·
      6 months ago

      I could describe it as a gold hunter with one of those sluice thingies, throwing the water out and keeping the gold. There, I've described short-term memory.

      shrug-outta-hecks

      I don't disagree that it's a material process; I just think we grab the most complex analogy we have at the time (as the author mentions), but then start taking the metaphor too far

      • Frank [he/him, he/him]
        ·
        6 months ago

        Yeah, but we, if "we" means people with a basic understanding of neuroscience, aren't taking it too far. The author is yelling at a straw man, or at lay people, which is equally pointless. Neuroscientists don't think of the mind, or the brain it runs on, as a literal digital computer. They have their own completely incomprehensible jargon for discussing the brain and the mind, and if this article is taken at face value, the author either doesn't know that or is talking to someone other than the people who do actual cognitive research.

        I'ma be honest, I think there might be some academic infighting here. Psychology is a field with little meaningful rigor and poor explanatory power, while neuroscience is on much firmer ground and has largely upended the theories arising from Epstein's heyday. I think he might be feeling the icy hand of mortality in his chest, upset that the world has moved past him and his ideas.

        Also, the gold miner isn't a good metaphor. In that metaphor information only goes one way and is sifted out of chaos. There's no place in the metaphor for a process of encoding, retrieving, or modifying information. It does not resemble the action of the mind and cannot be used as a rough and ready metaphor for discussing the mind.

        • Sidereal223 [he/him]
          ·
          6 months ago

          I work in neuroscience and I don't agree that it's on much firmer ground than psychology. In fact, as some people in the community have noted, mainstream neuroscience is probably still in a pre-paradigmatic stage (in Kuhn's sense). And believe it or not, a lot of neuroscientists naively do believe that the brain is like a computer (maybe not exactly one, but very close).

        • plinky [he/him]
          hexagon
          ·
          edit-2
          6 months ago

          Sure there is: encoding is taking sand from the river (turning noise from the world into comprehensible inputs), storage is keeping the gold, and modifying is throwing some bits out or taking them to the smith.

          From the bottom up (and in the middle, if we count partial electrical, ultrasound, or magnetic stimulation), neuroscience advances are significant but rather vague. We likely know how memory works at the molecular level, but that has jack shit to do with information pipelines; it comes from rigorous experiments, or, in the case of machine-human interfaces, from skilled interpretation of what you see and knowing where to look for it (you can ascribe that to the top-down approach).

          Neuroscientists likely don't, but I think you have a rather nicer opinion than I do of tech bros, and of the currency of their ideas among people

          • Frank [he/him, he/him]
            ·
            6 months ago

            My opinion of tech bros is that anyone deserving the label "tech bro" is a dangerous twit who should be under the full-time supervision of someone with humanities training, a gun, and orders to use it if the tech bro starts showing signs of independent thought. It's a thoroughly pathological worldview, a band of lethally competent illiterates who think they hold all human knowledge and wisdom. If this is all directed at tech bros, I likely didn't realize it, because I consider trying to teach nuance to tech bros about as useful as trying to teach it to a dog, and I didn't consider that someone in an academic field would want to address them.