A guy recently linked this essay. It's old, but I don't think it's significantly wrong (despite GPT evangelists). Also read Weizenbaum, libs, for the other side of the coin.
Embodied cognition. I don't see this as implying that what we're doing isn't computation (or information processing) in some sense. It's just that the way we're doing it is deeply, deeply different from how even neural networks instantiated on digital computers do it (among other things, our information processing is smeared out across the environment). That doesn't make it not computation in the same way that not having a cover and a mass in grams makes a PDF copy of Moby Dick not a book. There are functional, abstract similarities between PDFs and physical books that make them the same "kinds of things" in certain senses, but very different kinds of things in other senses.
Whether they're going to count as relevantly similar depends on which bundles of features you think are important or worth tracking, which in turn depends on what kinds of predictions you want to make or what you want to do. The fight about whether brains are "really" computers or not obscures the deeply value-laden and perspectival nature of a judgement like that. The danger doesn't lie in adopting the metaphor, but rather in failing to recognize it as a metaphor--or, to put it another way, in uncritically accepting the tech-bro framing of only those features that our brains have in common with digital computers as being things worth tracking, with the rest being "incidental."
I think I agree.
One metaphor I quite like is the brain as a ball of clay. Whenever you do anything the clay is gaining deformities and imprints and picking up impurities from the environment. Embodied cognition, right? Obviously the brain isn't actually a ball of clay but I think the metaphor is useful, and I like it more than I like being compared to a computer. After all, when a calculator computes the answer to a math problem the physical structure of the calculator doesn't change. The brain, though, actually changes! The computation metaphor misses this.
This is really useful for understanding memory, because every time you remember something you pick up that ball of clay and it changes.
What counts as "physical structure"? I can make an adding machine out of wood and steel balls that computes the answer to math problems by shuffling levers and balls around. A digital computer calculates the answer by changing voltages in a complicated set of circuits (and maybe flipping some little magnetic bits of stuff if it has a hard drive). Brains do it by (among other things) changing connections between neurons and the allocation of chemicals. Those are all physical changes. Are they relevantly similar physical changes? Again, that depends deeply on what you think is important enough to be worth tracking and what can be abstracted away, which is a value judgement. One of the Big Lies of the tech-bro narrative is that science is somehow value-free. It isn't. The choice of model, the choice of what to model, and the choice of what predictive projects we think are worth pursuing are all deeply evaluative choices.
In dwarf fortress you can make a computer out of dwarfs, gates, and levers, and it won't change unless the dwarfs go insane from sobriety and start smashing stuff.
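The multiple-realizability point behind both the steel-ball adding machine and the Dwarf Fortress computer can be made concrete in code: the adder below is defined entirely in terms of NAND gates, so nothing about it cares whether those gates are transistors, wooden levers, or dwarfs pulling levers. (A minimal illustrative sketch, not drawn from the article; all function names are my own.)

```python
# An 8-bit ripple-carry adder built only from NAND gates.
# The logic is identical no matter what physically realizes the gates.

def nand(a, b):
    return 0 if (a and b) else 1

def xor(a, b):
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    return nand(nand(a, a), nand(b, b))

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out)."""
    s1 = xor(a, b)
    return xor(s1, carry_in), or_(and_(a, b), and_(s1, carry_in))

def add(x, y, width=8):
    """Ripple the carry through `width` full adders."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add(19, 23))  # 42
```

Whether any of this captures what brains do is exactly the value-laden question being argued above; the sketch only shows that "computes addition" is a claim about abstract organization, not about any particular substrate.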
Great example! Failure modes are really important. Brains and dwarf fortresses might both be computers, but their different physical instantiations give them different ways to break down. Sometimes that's not important, but sometimes it's very important indeed. Those are the sorts of things that get obscured by these dogmatic all-or-nothing arguments.
Isn't that what this article is about? That "brain as computer" is a value judgement, just like "brain as hydraulic system" and "brain as telegraph" were? These metaphors are all useful; I think the article was just critiquing the inability of people to think of brains outside of the orthodox computational framework.
I'm just cautioning against taking things too far in the other direction: I genuinely don't think it's right to say "your brain isn't a computer," and I definitely think it's wrong to say that it doesn't process information. It's easy to slide from a critique of the computational theory of mind (either as it's presented academically by people like Pinker or popularly by Silicon Valley) into the opposite--but equally wrong--kind of position that brains are doing something wholly different. They're different in some respects, but there are also very significant similarities. We shouldn't lose sight of either, and it's important to be very careful when talking about this stuff.
Just as an example:
It strikes me as totally wrong to say that this process is free of computation. The computation that's going on here has interesting differences from what goes on in a ball-catching robot powered by a digital computer, but it is computation.
Your analogy reminds me a bit of Freud's essay on the mystic writing-pad.
That should be a red flag to treat it with caution. Freud was a crank and his only contribution to psychology was being so wrong it inspired generations of scientists to debunk him.