It's fine if you're working with things more abstract than numbers. You see O in computational complexity notation all the time, and O is often used for a structure sheaf, whose meaning varies with context, as in algebraic geometry. In general I think it's not an issue as long as you're not discussing functions where the reader might consider zero a possible input, like trigonometric functions.
And who uses "o" as a variable? Basically guaranteed unreadability.
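To illustrate the point, here's a minimal sketch (the identifiers are hypothetical, chosen only to show the problem): in many fonts the uppercase letter O, the lowercase letter o, and the digit 0 are nearly indistinguishable, so code that mixes them is hard to audit.

```python
# Hypothetical names chosen purely to demonstrate the readability hazard.
O = 10  # uppercase letter O
o = 0   # lowercase letter o

# Which of these three terms are letters and which is the digit zero?
x = O + o + 0
print(x)  # 10
```

A linter or style guide that simply bans `l`, `O`, and `o` as single-character names (as PEP 8 recommends for Python) sidesteps the whole problem.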
Whatever math detector they're using is hair-trigger and wrong.
Scientists who aren't mathematicians, usually.
Doctor Leo Spaceman most likely