Precision
Precision is the quality of something, such as a number or a proposition, being stated exactly.
With regard to numeric computations, precision often refers to how many digits are given exactly. For example, 0.001953125 is a precise reading of 2^-9 = 1/512, while 0.000976563 is an imprecise reading of 2^-10 = 1/1024, which is exactly 0.0009765625. 3.14 is an imprecise reading of π frequently encountered on standardized tests. But computing thousands of digits of π has no physical necessity, as just 39 digits will do for almost any conceivable scientific application.[1] "Four decimal places are sufficient for the design of the finest engines, ten decimal places would be sufficient to obtain the circumference of the earth within a fraction of an inch if the earth were a smooth sphere."[2]
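To make these readings concrete, here is a minimal Python sketch (not part of any OEIS entry; the function names are illustrative). It verifies with exact rational arithmetic that 0.001953125 is precisely 1/512 while 0.000976563 is not precisely 1/1024, and it computes π to 39 significant digits using Machin's 1706 formula π = 16·arctan(1/5) − 4·arctan(1/239).

    from decimal import Decimal, getcontext
    from fractions import Fraction

    # 0.001953125 is exactly 2^-9 = 1/512; Fraction confirms no rounding occurred.
    assert Fraction("0.001953125") == Fraction(1, 512)
    # 0.000976563 is only an approximation of 2^-10 = 1/1024 = 0.0009765625.
    assert Fraction("0.000976563") != Fraction(1, 1024)

    def arctan_inv(n, digits):
        """arctan(1/n) by its Taylor series, carried to `digits` decimal digits."""
        getcontext().prec = digits + 10          # 10 guard digits against rounding
        total = term = Decimal(1) / n            # first term: x = 1/n
        n2, k, sign = n * n, 3, -1
        eps = Decimal(10) ** (-(digits + 10))
        while abs(term) > eps:                   # terms shrink by a factor n^2 each step
            term /= n2                           # term is now x^k
            total += sign * term / k             # ... - x^3/3 + x^5/5 - ...
            k, sign = k + 2, -sign
        return total

    def pi_machin(digits=39):
        """pi = 16*arctan(1/5) - 4*arctan(1/239) (Machin's formula)."""
        pi = 16 * arctan_inv(5, digits) - 4 * arctan_inv(239, digits)
        getcontext().prec = digits
        return +pi                               # unary + rounds to `digits` digits

    print(pi_machin(39))  # 3.14159265358979323846264338327950288420

Machin's formula is just one convenient choice here; any arctangent series with small arguments converges fast enough for a few dozen digits.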
The issue of numeric precision is of course not limited to irrational numbers, and per the frivolous theorem of arithmetic it is applicable to almost all integers. Beyond 15!, most scientific calculators give factorials imprecisely, even allowing for the use of scientific notation: a 10-digit display shows 15! = 1.307674368 × 10^12 exactly (the remaining digits are zeros), but must round every larger factorial. In the Data field of OEIS sequence entries, full precision is expected, but it is acceptable to make remarks like "Next term, approximately 1.340780793 × 10^154, is too large to include here" in the Comments field when appropriate.
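The calculator limitation can be seen in a short Python sketch (the 10-significant-digit display width is an assumption modeled on a typical scientific calculator, not an OEIS convention):

    import math

    # Python integers are arbitrary-precision, so factorials are always exact.
    print(math.factorial(15))           # 1307674368000 -- equals 1.307674368e12
                                        # exactly, since the trailing digits are zeros
    print(math.factorial(16))           # 20922789888000 -- 11 significant digits

    # Rounded to a 10-significant-digit display, as on a typical calculator:
    print(f"{math.factorial(16):.9e}")  # 2.092278989e+13 -- imprecise: the final
                                        # digits 88 have been rounded to 9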
See also: Help:Calculation.
- ↑ Jamie Condliffe, "How Many Digits of Pi Do You Really Need?", Gizmodo.com, February 21, 2013.
- ↑ Petr Beckmann, A History of π, 5th ed., Boulder, Colorado: The Golem Press, 1982, p. 100.