Wednesday 21 January 2015

Precision and accuracy in medieval astronomy

What is the difference between precision and accuracy?

In modern English they are used almost interchangeably.  But there is a difference, of course.  I wonder what time it is now, when you are reading this.  Is it about eleven o'clock?  Or is it 09:34?  Of course, I have no way of knowing which of those guesses is more accurate.  But the second is obviously more precise.

[Image: two stopped timepieces. Caption: Which of these two timepieces is more accurate? Well, they've both stopped...]
That distinction may be more or less clear to us.  But that wasn't always the case for medieval astronomers.  What if I were to refine my guess, and say you're reading this at 09:34:27?  Is that any better? It's obviously more precise.  But when is it preferable to be more precise?  The answer to that might be more complicated than it appears.  In general, we might say that precision is only preferable when it increases accuracy.  But medieval scholars didn't always see it the same way.

I study astronomical tables.  Take a look at this amazing digitised version for an example.  That link points to a table of the daily precession of the stars and planetary apogees.  It's a lot more exciting than it sounds!

(Here's a brief astronomical explanation: skip it if you want...  Precession is the phenomenon whereby the stars appear to move very gradually around the sky, so that they're not in exactly the same place from year to year.  I don't mean the obvious daily rotation around the North Star that's caused by the Earth spinning on its axis - I mean a much slower change, caused by a gradual "wobble" in the direction the Earth's axis points.  The stars appear to shift 1° every 72 years - pretty hard to spot, but it explains why, right now, the Sun is still just about "in" the constellation Sagittarius (i.e. in front of those stars) even though it ought to be passing from Capricorn into Aquarius.  To be clear: the astrologers haven't got that wrong, because when they say it's the cusp of Aquarius, they mean the Sun has gone 120° around the sky (in modern terms, we've completed a third of our orbit) since the last equinox - the September one.  It's just that the background of stars has moved since the Ancient Greeks assigned them their positions between equinoxes and solstices.)
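
If you'd like to see how far that drift has got, here's a quick back-of-the-envelope sketch in Python.  The 72-years-per-degree rate comes from above; pinning "the Ancient Greeks" to Hipparchus, around 150 BC, is my own rough assumption for illustration:

    # A back-of-the-envelope check on the drift. The 1 degree per 72 years
    # rate is from the text above; dating "the Ancient Greeks" to Hipparchus
    # (c. 150 BC) is an assumption made purely for illustration.
    years_elapsed = 2015 - (-150)      # from Hipparchus' era to this post
    drift = years_elapsed / 72         # degrees of precession accumulated
    print(round(drift))                # ~30 degrees: about one whole sign

About 30° - almost exactly one zodiac sign, which is why the signs and the constellations are now offset by roughly one sign.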

So what?  The point is, precession is a VERY slow motion.  It's obviously almost impossible to observe with the naked eye.  It's impressive enough that ancient astronomers had even noticed it, so we shouldn't be surprised that their estimate of the rate was a bit different from ours.  That's why the table I linked above represents a precession of 1° every 136 years (their theory of precession included a separate, non-linear component that made up most of the difference).

But I said above that that table is a table of DAILY precession.  What's the point of tabulating daily values for something that changes one degree every 136 years?!

Good question! Here's another one: What's the point of tabulating those daily values to a precision of billionths of billionths of degrees?!  I don't even know what a billionth of a billionth of a degree is called, but that is the precision represented by the daily value of 0;0,0,4,20,41,17,12,26,37.  (That's a sexagesimal number: 0°, 0 minutes, 0 seconds, 4 thirds... In decimal terms it's 0.0000201148235466718.)  The 37 in the final column of the table is 3.67 × 10⁻¹⁵ of a degree.  To put that in context, that's one 98,000,000,000,000,000th part of a complete circle. It would take approximately 750 billion years for these daily 37s to accumulate to even a degree's difference.
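
Don't take my word for the arithmetic - here's a little Python sketch that checks it with exact fractions.  The sexagesimal digits are the ones from the table; the 365.25-day Julian year is the standard one:

    # Checking the arithmetic with exact fractions throughout.
    from fractions import Fraction

    # 0;0,0,4,20,41,17,12,26,37 -- degrees, then successive sixtieths
    digits = [0, 0, 0, 4, 20, 41, 17, 12, 26, 37]
    daily = sum(Fraction(d, 60**k) for k, d in enumerate(digits))
    year = Fraction(36525, 100)          # 365.25 days

    print(float(daily))                  # ~2.011482e-05 degrees per day
    print(float(1 / (daily * year)))     # ~136.1 years per degree
    print(float(360 / (daily * year)))   # ~49,000 years for a full revolution

    last = Fraction(37, 60**9)           # the final "37" on its own
    print(float(last))                   # ~3.67e-15 degrees per day
    print(float(360 / last))             # ~9.8e16: its share of a full circle
    print(float(1 / (last * year)))      # ~7.5e11 years to amount to a degree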

That level of precision in the tables clearly didn't arise from naked-eye observation of the stars.  No, it's a result of the way the tables were computed.  And astronomers clearly realised that - they understood that such precision was unobservable.  Yet they maintained it when they copied and recomputed the tables.  Why?  Because, I suppose, they reckoned that more precision is better than less.  To put it another way: you say why keep those 37s?  They would say, what makes you so sure you can get rid of them?
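
To see what I mean, here's one plausible reconstruction in Python.  I'm assuming the underlying parameter is the one usually cited for the Alfonsine tables - a full revolution in 49,000 Julian years, which is the same rate as 1° in about 136 years (49,000/360 ≈ 136):

    # One plausible reconstruction of where the digits come from, assuming
    # a full revolution in 49,000 Julian years (the usual Alfonsine value).
    from fractions import Fraction

    def to_sexagesimal(x, places):
        """Integer part, then `places` successive base-60 fractional digits."""
        out = [int(x)]
        x -= int(x)
        for _ in range(places):
            x *= 60
            out.append(int(x))
            x -= int(x)
        return out

    daily = Fraction(360) / (49000 * Fraction(36525, 100))
    print(to_sexagesimal(daily, 9))
    # [0, 0, 0, 4, 20, 41, 17, 8, 25, 14] -- the first six fractional places
    # match the tabulated 0;0,0,4,20,41,17,...; the deeper places depend on
    # exactly where the original computation rounded. Either way, the tail
    # never terminates of its own accord: the digits come from the division,
    # not from the sky.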

Isn't that silly?  Hold on a moment - you may not be much better.  A friend of mine recently posted this on Facebook:

[Image: a friend's Facebook post - a photo of a recipe whose metric conversions don't quite match the imperial quantities]

I know how these things work: authors of recipe books work out their recipes in their own ways.  Delia Smith was obviously used to using pounds and ounces.  She used 2 oz of sugar.  2 oz is about 56.7g, but no editor will let that go into the published cookbook.  So it gets rounded down to 50g.  Then when Delia calls for 6 oz (about 170.1g), it gets rounded up to 175g.
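
Here's my guess at the editor's rule, sketched in Python: convert, then round to the nearest kitchen-friendly 25g.  That one hypothetical rule reproduces both conversions:

    # A guess at the cookbook editor's rule: round to the nearest 25 g step.
    OZ_G = 28.35                         # grams per avoirdupois ounce

    def kitchen(oz, step=25):
        return round(oz * OZ_G / step) * step

    print(2 * OZ_G, kitchen(2))          # 56.7  -> 50 g (rounded down)
    print(6 * OZ_G, kitchen(6))          # 170.1 -> 175 g (rounded up)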

Here's the weird bit: I know this is what's happening - I even have a magnetic converter on my fridge door that tells me that 2 oz is 50g and 6 oz is 175g.  But that doesn't stop me measuring out the quantities with exaggerated care, paying attention to the slightest fluctuation on the scales.  And what about the eggs?  I'm precise to the last gram of sugar even in recipes that use eggs, when I'm well aware that the size of eggs can vary widely.  If I can sustain this kind of cognitive dissonance, perhaps I shouldn't be too critical of the medieval astronomers.
