Wait, what does "near perfectly" mean? Does it sometimes just fuck up and arrive at the wrong answer?
The heatmap on the right in the image shows the error. It gets progressively worse as the numbers get larger. Notably, the error is also not symmetric in the operands, so the model is not aware that addition is commutative. Even after 2^128 or so training examples (the training set seems to be every pair of unsigned 64-bit integers), it couldn't figure out that a + b = b + a.
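If you want to quantify that asymmetry rather than eyeball the heatmap, here's a minimal sketch (assuming a hypothetical predict_sum(a, b) wrapper around the trained model; the stub below just fakes a small operand-dependent error so the script runs on its own):

```python
import random

def predict_sum(a: int, b: int) -> float:
    """Hypothetical wrapper around the trained model's prediction of a + b.
    The stub fakes a small operand-dependent error so this runs standalone;
    swap in the real model to measure its actual asymmetry."""
    return a + b + 1e-3 * a - 7e-4 * b

def commutativity_gap(trials: int = 10_000, bits: int = 64) -> float:
    """Mean |f(a, b) - f(b, a)| over random operand pairs.
    A model that had actually learned commutativity would score exactly 0."""
    total = 0.0
    for _ in range(trials):
        a = random.randrange(2**bits)
        b = random.randrange(2**bits)
        total += abs(predict_sum(a, b) - predict_sum(b, a))
    return total / trials

if __name__ == "__main__":
    print(f"mean commutativity gap: {commutativity_gap():.3g}")
```

A nonzero gap on held-out pairs is basically the asymmetry the heatmap shows, reduced to a single number.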
aaaaaaaaaaaa
TBH I wouldn't expect an ML algorithm to "figure out" that addition is commutative, even a good one with acceptable errors (unlike this one); it's a big logical leap that it isn't really suited to make by itself (ofc this just means it's a silly way to try to do addition on a computer)
Neither would I; I guess I'm more pointing out the distinction between prediction accuracy and understanding.
fwiw, commutativity didn't really get specifically called out by mathematicians until they adopted some kind of symbolic representation (which happened at vastly different times in different places). without algebra, there's not much reason to spell it out, even if you happen to notice the pattern, and it's even harder to prove it. (actually... it's absurdly hard to prove even with it - see the Principia Mathematica...)
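for concreteness, here's a sketch of the standard modern inductive argument (Peano-style recursion, nothing like the Principia's machinery), just to show what "proving it" actually involves:

```latex
% Sketch of the usual inductive proof, assuming addition is defined
% recursively by  a + 0 = a  and  a + S(b) = S(a + b).
\begin{align*}
  &\text{Lemma 1 (induction on $a$):} && 0 + a = a\\
  &\text{Lemma 2 (induction on $b$):} && S(a) + b = S(a + b)\\
  &\text{Theorem (induction on $b$):} && a + b = b + a\\
  &\quad\text{base: } a + 0 = a = 0 + a && \text{(definition; Lemma 1)}\\
  &\quad\text{step: } a + S(b) = S(a + b) = S(b + a) = S(b) + a && \text{(definition; IH; Lemma 2)}
\end{align*}
```

short once you have the symbolic machinery, but every step leans on a definition or lemma you first have to think to state.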
these algorithms are clearly not reasoning, but this isn't an example of that. yes, it seems obvious and simple now, but that shortchanges how huge a shift the switch to symbolic reasoning was in the first place. and that's setting aside whether notions like "memory" and "attention" are things these algorithms can actually do (don't get me started on how obtuse the literature is on this point).
To save money - Boeing is already using it?
Computers often do. Otherwise there wouldn't be crashes.
You clearly haven’t seen my code
I think the ALU messing up is something that happens quite rarely lol.