At my last job I was bored, so I wrote SQL Server functions to perform standard math operations on varchar(max) values and used them to build factorial tables, which I then used to iteratively calculate pi. I think I got up to around 100 digits before I got yelled at for bogging down the server and had to stop.
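The comment doesn't say which factorial-based series was used, but the same idea can be sketched outside SQL Server. Here's a hedged Python sketch (an assumption, not the commenter's actual code) using the `decimal` module for arbitrary precision and Euler's series pi/2 = sum over n of n! / (2n+1)!!, whose terms are exactly the kind of factorial ratios a factorial table would feed:

```python
from decimal import Decimal, getcontext

def pi_digits(digits):
    """Approximate pi via Euler's series pi/2 = sum_{n>=0} n! / (2n+1)!!.

    Illustrative sketch only; the function name and convergence cutoff
    are assumptions, not from the original comment.
    """
    getcontext().prec = digits + 10      # extra guard digits
    total = Decimal(0)
    num = Decimal(1)                     # n!, built incrementally
    den = Decimal(1)                     # (2n+1)!! = 1 * 3 * 5 * ... * (2n+1)
    n = 0
    while True:
        term = num / den
        if term < Decimal(10) ** -(digits + 5):
            break                        # remaining terms too small to matter
        total += term
        n += 1
        num *= n                         # extend n! by one factor
        den *= 2 * n + 1                 # extend the double factorial
    return +(2 * total)                  # unary + rounds to context precision

print(pi_digits(100))
```

Each term roughly halves the previous one, so about 3.3 terms per decimal digit are needed; 100 digits is a few hundred iterations, which is why doing this through varchar(max) arithmetic would bog a server down.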
You get that level of precision in a standard "double" floating point number. So that's basically the normal level of precision you get without trying.
Why stop at 1 billion?... Let's go for a trillion, just because we can.