Computational universality has nothing to do with a digital computer flipping bits. It just means that any system which manipulates information (i.e. performs computation) and can do so at a certain level of complexity (there are lots of equivalent ways of formulating that threshold, but the simplest is that it can do integer arithmetic) is exactly equivalent to every other such system, in that they can all compute the same set of things.
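To make that concrete, here's a minimal sketch (my own toy example, not something from the original argument) of a Turing machine simulator. The point is how little machinery universality actually requires: a lookup table of rules, a read/write head, and an unbounded tape, nothing resembling a modern CPU.

```python
# Hypothetical illustration: a tiny Turing machine simulator.
# "rules" maps (state, symbol) -> (new_symbol, move, new_state).
def run_turing_machine(rules, tape, state="A", steps=1000):
    tape = dict(enumerate(tape))  # sparse tape, unwritten cells default to 0
    head = 0
    for _ in range(steps):
        if state == "HALT":
            break
        symbol = tape.get(head, 0)
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return tape, state

# Example: a one-state machine that erases a run of 1s and halts.
rules = {
    ("A", 1): (0, "R", "A"),
    ("A", 0): (0, "R", "HALT"),
}
print(run_turing_machine(rules, [1, 1, 1]))  # tape ends up all zeros
```

Systems this simple, given enough tape and time, can compute anything a supercomputer can; that's the whole claim of universality.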
It's pretty obvious that the human brain is at least Turing complete, since we can do integer arithmetic. It's also impossible for any computational system to be "more" than Turing complete (whatever that would even mean), since every algorithm that can be computed in finite time can be expressed in terms of integer arithmetic, which means a Turing machine could perform it.
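As a hedged illustration of "integer arithmetic is enough" (again my own toy example): a Minsky counter machine, whose only operations are "add 1 to a register" and "subtract 1, or jump if the register is zero", is a standard example of a Turing-complete model built from nothing but bare integer arithmetic.

```python
# Hypothetical illustration: an interpreter for a Minsky-style counter machine.
# Instructions are ("INC", reg) or ("DECJZ", reg, jump_target).
def run_counter_machine(program, registers):
    pc = 0
    while pc < len(program):
        op = program[pc]
        if op[0] == "INC":
            registers[op[1]] += 1
            pc += 1
        else:  # DECJZ: jump if zero, otherwise decrement
            if registers[op[1]] == 0:
                pc = op[2]
            else:
                registers[op[1]] -= 1
                pc += 1
    return registers

# Toy program: drain register "a" into register "b" (i.e. b += a).
program = [
    ("DECJZ", "a", 3),   # if a == 0, jump past the end and halt
    ("INC", "b"),
    ("DECJZ", "z", 0),   # "z" is always 0, so this is an unconditional jump back
]
print(run_counter_machine(program, {"a": 3, "b": 4, "z": 0}))  # {'a': 0, 'b': 7, 'z': 0}
```

Chains of operations like these can, with enough patience, express any finite algorithm, which is why doing arithmetic at all clears the bar for universality.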
Obviously the human brain is many, many layers of abstraction deep and is FAR more complicated than modern computers. And neurons aren't literally performing a bunch of addition and subtraction operations on data; the point is that whatever they are doing must be logically equivalent to some incomprehensibly vast set of simple arithmetic operations that a Turing machine could perform, because if the human brain could do a single thing that a general Turing machine can't, that thing would have to take infinite time or require infinite resources.