Imagine you heard someone offhandedly say “The distance from London to Rome is about 1000 miles.” Now say you are more comfortable in SI units so you do a quick conversion in your head and think “So the distance from London to Rome is about 1600 km, huh?”

A mistake has been made here. Units are used by humans (and presumably other sapient lifeforms) to give meaning to numerical values by providing standards against which to measure them (“The Earth is 6400 across.” “6400 *what*?”). A conversion from one unit system to another should not affect accuracy or precision, since units are just reference standards. And yet in the above example, precision was seemingly increased by the conversion.

When we cite a value like “1000 miles”, certain assumptions go with it. For one, we typically don’t take it to mean *exactly* 1000 miles. Maybe it was 999.97 miles, but in everyday parlance that doesn’t matter. In fact, a rule of thumb might be to allow for an “off-by-one” error in the last significant digit in everyday situations, so “1000 miles” really translates to “somewhere between 900 and 1100 miles with pretty high confidence.” The second assumption is that the trailing zeros aren’t significant digits.

It’s much clearer if we use scientific notation. There’s a difference in precision between 1 × 10³ miles and 1.0 × 10³ miles, though when we read “1000 miles” we assume the former case. But in the above example we went from “1000 miles” to “1600 km”, which implied two digits of precision instead of one. By converting between units, we magically improved our precision.

If “1000 miles” really means “900 to 1100 miles with pretty high confidence”, then “1600 km” likewise means “1500 to 1700 km with pretty high confidence”. But a more exact conversion translates that back to roughly “930 to 1060 miles with pretty high confidence”. Our error bands somehow shrank by converting back and forth; our precision increased when it shouldn’t have.
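The round trip above can be checked with a few lines of arithmetic. This is a minimal sketch, not anything from the original post; the only fact it relies on is the exact definition of the international mile (1 mile = 1.609344 km):

```python
MI_TO_KM = 1.609344  # exact, by definition of the international mile

# "1000 miles" with an off-by-one error in its one significant digit:
lo_mi, hi_mi = 900, 1100

# The honest converted band:
lo_km, hi_km = lo_mi * MI_TO_KM, hi_mi * MI_TO_KM
print(round(lo_km), round(hi_km))  # about 1448 to 1770 km

# But stating the result as "1600 km" implies a band of 1500 to 1700 km.
# Converting that implied band back to miles:
back_lo, back_hi = 1500 / MI_TO_KM, 1700 / MI_TO_KM
print(round(back_lo), round(back_hi))  # about 932 to 1056 miles

# A 200-mile band came back about 124 miles wide: precision out of thin air.
```

The narrowing is entirely an artifact of rounding the converted value to more significant figures than the original had.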

When you’re being precise, the solution is to either explicitly state your error margins or use the more exact form of scientific notation. That’s one of the main reasons we teach scientific notation to students, though I doubt it’s widely appreciated by them (the other being to enhance the ability to perform mental calculations). In print, this issue can be ameliorated by writing something like “1.0 thousand miles” if that’s the precision you require, though it’s a bit awkward. This is one of the many reasons why metric systems are much more convenient: there’s a simple and obvious way to scale by factors of a thousand.

*Summary: Be on the lookout for unit conversions that magically increase precision.*

By the way, the actual distance from London to Rome is 892 miles or 1436 km, as the crow flies. But wait, how did we go from three to four significant figures?