Except for the part claiming that using OCT or DEC to talk about octal and decimal numbers is ok.
From Wikipedia:
In programming languages, octal literals are typically identified with a variety of prefixes, including the digit 0, the letters o or q, the digit–letter combination 0o, or the symbol &[12] or $. In Motorola convention, octal numbers are prefixed with @, whereas a small (or capital[13]) letter o[13] or q[13] is added as a postfix following the Intel convention.[14][15] In Concurrent DOS, Multiuser DOS and REAL/32 as well as in DOS Plus and DR-DOS various environment variables like $CLS, $ON, $OFF, $HEADER or $FOOTER support an \nnn octal number notation,[16][17][18] and DR-DOS DEBUG utilizes \ to prefix octal numbers as well.
For example, the literal 73 (base 8) might be represented as 073, o73, q73, 0o73, \73, @73, &73, $73 or 73o in various languages.
Newer languages have been abandoning the prefix 0, as decimal numbers are often represented with leading zeroes. The prefix q was introduced to avoid the prefix o being mistaken for a zero, while the prefix 0o was introduced to avoid starting a numerical literal with an alphabetic character (like o or q), since these might cause the literal to be confused with a variable name. The prefix 0o also follows the model set by the prefix 0x used for hexadecimal literals in the C language; it is supported by Haskell,[19] OCaml,[20] Python as of version 3.0,[21] Raku,[22] Ruby,[23] Tcl as of version 9,[24] PHP as of version 8.1,[25] Rust[26] and it is intended to be supported by ECMAScript 6[27] (the prefix 0 originally stood for base 8 in JavaScript but could cause confusion,[28] therefore it has been discouraged in ECMAScript 3 and dropped in ECMAScript 5[29]).
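To make the 0o form concrete, here is a quick Python 3 session (my own illustration, since Python is one of the languages listed above):

    >>> 0o73             # the 0o prefix marks an octal literal
    59
    >>> 7 * 8 + 3        # 73 (base 8) is 59 (base 10)
    59
    >>> oct(59)          # converting back to octal notation
    '0o73'
    >>> 0x73             # compare the 0x prefix for hexadecimal
    115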
I think 0o31 would be the "correctish" way a programmer/computer scientist would talk about it.
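For instance, in Python 3 (a quick sanity check of my own, not part of the quote):

    >>> 0o31             # octal 31 is decimal 25
    25
    >>> int('31', 8)     # the same conversion with an explicit base
    25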
I just wanted a short explanation.
Is this even right?
Yes, that is correct.
Cool. The plausible answers are always the most dangerous.
Is anyone else bothered by how often things are reiterated in this reply?