0.999...
In mathematics, 0.999... is a repeating decimal that is an alternative way of writing the number 1. The three dots represent an infinite list of "9" digits. Following the standard rules for representing real numbers in decimal notation, its value is the smallest number greater than every number in the increasing sequence 0.9, 0.99, 0.999, and so on. It can be proved that this number is 1; that is, 0.999... = 1.
Despite common misconceptions, 0.999... is not "almost exactly 1" or "very, very nearly but not quite 1"; rather, "0.999..." and "1" represent exactly the same number.
There are many ways of showing this equality, from intuitive arguments to mathematically rigorous proofs. The intuitive arguments are generally based on properties of finite decimals that are extended without proof to infinite decimals. An elementary but rigorous proof is given below that involves only elementary arithmetic and the Archimedean property: for each real number, there is a natural number that is greater (for example, by rounding up). Other proofs generally involve basic properties of real numbers and methods of calculus, such as series and limits. Why some people reject this equality is a question studied in mathematics education.
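The series-and-limits view mentioned above can be made concrete: 0.999... is the limit of the partial sums 9/10 + 9/100 + ... + 9/10^n, and each partial sum equals exactly 1 - 10^-n. A minimal sketch using Python's exact rational arithmetic (the function name `partial_sum` is ours, chosen for illustration):

```python
from fractions import Fraction

def partial_sum(n):
    """Sum of the first n terms 9/10^k, i.e. the finite decimal 0.9...9 (n nines)."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 2, 5, 10):
    s = partial_sum(n)
    # Closed form: the partial sum is exactly 1 - 10^-n,
    # so the gap to 1 shrinks by a factor of 10 at each step.
    assert s == 1 - Fraction(1, 10**n)
```

Because the gap 10^-n can be made smaller than any positive number, the limit of these partial sums is 1.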
In other number systems, 0.999... can have the same meaning, a different definition, or be undefined. Every non-zero terminating decimal has two equal representations (for example, 8.32000... and 8.31999...). Having values with multiple representations is a feature of all positional numeral systems that represent the real numbers.
It is possible to prove the equation 0.999... = 1 using just the mathematical tools of comparison and addition of (finite) decimal numbers, without any reference to more advanced topics. The proof given below is a direct formalization of the intuitive fact that, if one draws 0.9, 0.99, 0.999, etc. on the number line, there is no room left for placing a number between them and 1. The meaning of the notation 0.999... is the least point on the number line lying to the right of all of the numbers 0.9, 0.99, 0.999, etc. Because there is ultimately no room between 1 and these numbers, the point 1 must be this least point, and so 0.999... = 1.
If one places 0.9, 0.99, 0.999, etc. on the number line, one sees immediately that all these points are to the left of 1, and that they get closer and closer to 1. For any number that is less than 1, the sequence 0.9, 0.99, 0.999, and so on will eventually reach a number larger than it. So, it does not make sense to identify 0.999... with any number smaller than 1.
Meanwhile, every number larger than 1 will be larger than any decimal of the form 0.999...9 for any finite number of nines. Therefore, 0.999... cannot be identified with any number larger than 1, either.
Because 0.999... cannot be bigger than 1 or smaller than 1, it must equal 1 if it is to be any real number at all.
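The first half of this squeeze argument can be checked mechanically: given any rational x < 1, the Archimedean property guarantees a finite decimal 0.9...9 that exceeds it. A small sketch in Python (the helper name `nines_exceeding` is ours, introduced only for this illustration):

```python
from fractions import Fraction

def nines_exceeding(x):
    """For a rational x < 1, return the smallest n such that the finite
    decimal 0.9...9 (n nines) is strictly larger than x.  Such an n
    always exists: the nth finite decimal equals 1 - 10^-n, and by the
    Archimedean property 10^n eventually exceeds 1/(1 - x)."""
    assert x < 1
    n = 1
    while 1 - Fraction(1, 10**n) <= x:
        n += 1
    return n

# Even a number extremely close to 1 is eventually overtaken:
print(nines_exceeding(Fraction(9999999, 10000000)))  # -> 8 (eight nines suffice)
```

The loop terminates for every x < 1, which is exactly why 0.999... cannot denote any number smaller than 1.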