When we write "one and a half" in decimal notation as 1.5, do we really mean 1.5000... (with an infinity of zeros)? If so, how do we know there's no "3" lurking at the ten-millionth decimal place? Is this a problem of converting an analogue world into digital mathematics?
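
One way to sharpen the worry: on the standard reading, the numeral 1.5 names the exact rational 3/2, and the n-th digit after the decimal point of a real number x is determined by the conventional formula d_n = ⌊10^n x⌋ mod 10. A short calculation (a sketch under that standard convention, nothing more) shows every digit after the 5 is forced to be 0, leaving no room for a stray 3:

\[
1.5 = \frac{3}{2}, \qquad
d_n = \left\lfloor 10^{\,n} \cdot \tfrac{3}{2} \right\rfloor \bmod 10 =
\begin{cases}
5 & n = 1,\\
0 & n \ge 2,
\end{cases}
\]

since \(10^{\,n} \cdot \tfrac{3}{2} = 15 \cdot 10^{\,n-1}\) is an integer ending in 0 whenever \(n \ge 2\). On this view the digits are not discovered by measurement; they are fixed by the number the numeral names.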
