To the editors:
Thanks to Cecil Adams for the beaut on calculating pi out to umpteen decimal places [March 16]. Absurd though it is, I can envision one good reason that some computer guys may do it: they use it as a standard, endless task with which to calibrate their computers' speeds.
Be that as it may, the exercise is even more absurd than the "learned treatise" you quoted said it was: "thirty-nine places of pi suffice for computing the circumference of a circle girdling the known universe with an error no greater than the radius of a hydrogen atom." Actually, only 37 digits to the right of the decimal are required. Don't panic: I won't leave you hanging.

Here's why: a reasonable value for the radius of the universe is 2 × 10^36 angstroms. That's just the product of 20 billion years (the time since the big bang) and the speed of light (an upper limit on the rate of expansion). Since pi equals the circumference of the "girdle" (what kind of learned guy would use a word like that, anyway?) divided by twice the radius, the uncertainty in pi equals the uncertainty in the circumference (one half angstrom, the radius of a hydrogen atom) divided by twice the radius. That's (1/2)/(2 × [2 × 10^36]), or 1/(8 × 10^36), or about 10^-37. Knowing pi to 39 decimal places would nearly suffice for computing the circumference of a circle enclosing the known universe with an error no greater than the radius of the nucleus of a hydrogen atom, and that's a whole lot smaller than the entire atom. I'm sure you'd want to get a thing like that straight.
Dr. Neil Basescu
Cecil Adams replies:
I knew that number sounded fishy. Thanks for clearing things up.
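[Editors' note: The letter's chain of arithmetic is easy to check mechanically. Here is a minimal Python sketch using the letter's round-number inputs (20 billion years, a half-angstrom hydrogen atom); the particular constants for the year length and the speed of light are our own approximations.]

```python
from math import ceil, log10

# Back-of-the-envelope check of the letter's arithmetic. The inputs are
# the letter's round-number assumptions, not precise measurements.
AGE_YEARS = 20e9            # time since the big bang, in years (letter's figure)
C = 2.998e8                 # speed of light, m/s
SECONDS_PER_YEAR = 3.156e7  # ~365.25 days
ANGSTROM = 1e-10            # meters per angstrom

# Radius of the known universe: light travel distance since the big bang.
radius_angstroms = AGE_YEARS * SECONDS_PER_YEAR * C / ANGSTROM  # ~2e36

# pi = circumference / (2 * radius), so an error of half an angstrom
# (a hydrogen atom's radius) in the circumference corresponds to an
# error of (1/2) / (2 * radius) in pi.
d_pi = 0.5 / (2 * radius_angstroms)  # ~1e-37

# Decimal places of pi needed so the truncation error stays below d_pi:
digits = ceil(-log10(d_pi))
print(f"radius ~ {radius_angstroms:.1e} angstroms")
print(f"allowed error in pi ~ {d_pi:.1e}")
print(f"decimal places needed: {digits}")
```

Changing the assumed age of the universe by a factor of a few moves the answer by at most one digit, so the conclusion is insensitive to the exact inputs.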