Took me 2 hours to find out why the final output of a neural network was a bunch of NaN. This is always very annoying, but I can't really complain, it makes sense. Just sucks.
I guess you can always just add an
assert not data.isna().any()
in strategic locations.

That could be a nice way. Sadly it was in a C++ code base (using TensorFlow), so no such nice things (it would be slow, too). I skill-issued myself thinking a struct would be zero-initialized, but

MyStruct input;

would not be, while

MyStruct input {};

will (that was the fix; see the sketch below). Long story.

Oof. This makes me appreciate the abstractions in Go. It's a small thing, but initializing structs with zero values by default is nice.
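For reference, a minimal sketch of the difference; MyStruct here is a hypothetical stand-in, not the actual struct from that code base:

#include <cstdio>

// Hypothetical plain struct with no default member initializers.
struct MyStruct {
    float weight;
    float bias;
};

int main() {
    MyStruct a;   // default-initialized: members hold indeterminate garbage
    MyStruct b{}; // value-initialized: members are zeroed
    std::printf("b = {%f, %f}\n", b.weight, b.bias); // guaranteed 0.000000, 0.000000
    // Reading a.weight or a.bias here would be undefined behavior.
    (void)a;
    return 0;
}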
While coding my own engine in C++ with OpenGL, I forgot to do something. Maybe I forgot to assign a pointer or forgot to pass a variable. In the end I had copied a NaN value into the vertices of my Model, which was meant to be a wrapper for data I wanted to read and visualize.
Printing the entire Model to the terminal, I was confused as to why everything was suddenly NaN when it had started out fine.
NaN stands for Not a Number. To simplify very briefly (and not accurately at all): when defining a standard for representing fractional values using binary digits in computers, they systematically mapped ranges of bit patterns to fractional numbers. Some of the possible bit patterns were, for reasons not worth going into, left unused, so they were designated as NaNs, and the payload of the NaN itself is supposed to tell you what went wrong in the calculation that produced it. Obviously, if you use a NaN in an arithmetic operation, the result is also Not a Number, and that's what the meme is referring to.
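A small illustration of that propagation in standard C++ (nothing assumed beyond <cmath> and <limits>):

#include <cmath>
#include <cstdio>
#include <limits>

int main() {
    double nan = std::numeric_limits<double>::quiet_NaN();
    // Any arithmetic involving NaN yields NaN: it poisons everything downstream.
    std::printf("nan + 1.0 = %f\n", nan + 1.0); // nan
    std::printf("0.0 * nan = %f\n", 0.0 * nan); // nan
    // NaN is the only value that compares unequal to itself, which is why
    // std::isnan (or the x != x trick) is the way to detect it.
    std::printf("nan == nan -> %d\n", nan == nan ? 1 : 0);       // 0
    std::printf("isnan(nan) -> %d\n", std::isnan(nan) ? 1 : 0);  // 1
    return 0;
}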
I think the real explanation is simpler and more understandable.
NaN is what you get when you do something with no meaningful answer, like dividing zero by zero. There is no answer, but the operation has to result in something, so it gives you NaN, because the result is literally not a number.
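For the curious, a quick sketch of which operations actually produce NaN under IEEE 754 (note that 1.0 / 0.0 is defined as infinity, not NaN; only 0.0 / 0.0 is NaN):

#include <cmath>
#include <cstdio>
#include <limits>

int main() {
    double zero = 0.0;
    double inf = std::numeric_limits<double>::infinity();
    // Operations with no meaningful numeric answer produce NaN:
    std::printf("0.0 / 0.0  = %f\n", zero / zero);      // nan
    std::printf("inf - inf  = %f\n", inf - inf);        // nan
    std::printf("sqrt(-1.0) = %f\n", std::sqrt(-1.0));  // nan
    // Division of a nonzero value by zero is not NaN; it overflows to infinity.
    std::printf("1.0 / 0.0  = %f\n", 1.0 / zero);       // inf
    return 0;
}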