Pop quiz! How long is the average year? If you said “365 days”, then congratulations! You’re wrong. If you said “365.25 days”, then gratulationes – you’re also wrong, and probably a Roman. The average year is, in fact, 365.242 days long – just so long as you’re not talking about the sidereal year.
So what’s that?
What is a sidereal year?
Simply put, a sidereal year – it’s pronounced sy-dear-ee-al, by the way, despite being spelled side-real – is the amount of time it takes for Earth to orbit the Sun once with respect to the fixed stars.
Now, that might sound like a familiar description. In fact, isn’t that the definition of, you know, a regular year? Well, not quite.
“The sidereal year is the time taken for the Earth to complete one revolution of its orbit, as measured against a fixed frame of reference (such as the fixed stars, Latin sidera, singular sidus),” explains Michael J. White, Professor of Philosophy and Law at Arizona State University, in his notes on the various definitions of years. “Its average duration is 365.256363004 mean solar days (365 d 6 h 9 min 9.76 s).”
The regular year, meanwhile – aka the “tropical year” – is “the period of time for the ecliptic longitude of the Sun to increase by 360 degrees,” White continues. “The mean tropical year is approximately 365 days, 5 hours, 48 minutes, 45 seconds.”
See? Totally different: the sidereal year is more than 20 whole minutes longer than the tropical year!
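If you want to check that gap yourself, it falls straight out of the two figures White gives. Here’s a quick, purely illustrative Python sketch (the variable names are ours, not anything official):

```python
# Year lengths quoted above, in mean solar days (figures from White)
sidereal_year = 365.256363004                      # 365 d 6 h 9 min 9.76 s
tropical_year = 365 + 5/24 + 48/1440 + 45/86400    # 365 d 5 h 48 min 45 s

diff_days = sidereal_year - tropical_year
diff_minutes = diff_days * 24 * 60
print(f"The sidereal year is about {diff_minutes:.1f} minutes longer.")
# → about 20.4 minutes
```

Roughly 20 minutes and 25 seconds per year, consistent with the difference between the two quoted durations.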
What’s the point of the sidereal year?
Sidereal years are an old, old concept – coming right from the beginnings of astronomy itself. Back then, life was pretty much dominated by two things: the seasons, which could kill you and everyone you knew by being just a tiny bit too dry or wet or long or short, and the night sky, which was what passed for entertainment.
“A great deal of human effort has been expended over the past 4000 years or so in trying to predict and explain the motions of the Sun, Moon, planets and stars,” wrote Chris Linton, a professor in Loughborough University’s Department of Mathematical Sciences, in his 2004 book From Eudoxus to Einstein: a history of mathematical astronomy.
“For a variety of reasons, early astronomers thought that the Earth was stationary and that the heavenly bodies moved around it,” he explained. They thought the stars were fixed, all equally distant from the Earth and attached to “the celestial sphere” – “a real entity,” Linton wrote, “with the stars attached physically to it.”
Now, the ancients already had a concept of a “year” – after all, at its most basic, a year is just the amount of time it takes for the world to get back to where it was, right? For the Nile to flood or the rains to come, for your crops to grow, for your harvest to come in, for the winter to draw in, and for the whole thing to start all over again. But it doesn’t take long, at least in a time before TV and video games, before you start noticing other patterns that follow this timespan.
The stars, as one notable example, move across the sky in a roughly 365-day march: “Provided we have an accurate means of measuring time, we can observe that the stars actually complete a revolution about the pole in about 23 h 56 min, so that they return to the same place at the same time in one year,” Linton explained. And for early astronomers, that led to a natural conclusion: “If we regard the stars as fixed on the celestial sphere, then the Sun must move relative to the stars in the direction opposite to the diurnal motion,” Linton noted, “completing one circuit of the celestial sphere in a year.”
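Linton’s numbers make for a neat sanity check: if the stars circle the pole about 4 minutes faster than the 24-hour solar day, that nightly lag should add up to one full extra circuit of the sky in roughly a year. A rough sketch, using the rounded figures above:

```python
# Rounded figures from the passage: stars circle the pole in about
# 23 h 56 min; the solar day is 24 h. Each night the stars return
# about 4 minutes early, and over a year those lags total 24 hours.
solar_day_min = 24 * 60            # 1440 minutes
sidereal_day_min = 23 * 60 + 56    # 1436 minutes (rounded)
daily_lag = solar_day_min - sidereal_day_min     # ~4 minutes per night
days_to_full_lap = solar_day_min / daily_lag     # nights until lag = 24 h
print(daily_lag, days_to_full_lap)  # 4 360.0 – roughly a year
```

With the unrounded sidereal day (23 h 56 min 4 s) the figure comes out closer to 365 days, which is exactly the “return to the same place at the same time in one year” that Linton describes.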
So the early astronomers just… screwed up?
You’d think so, but no.
Given the minute size of the discrepancy – plus the fact that the world has already rejigged the calendar quite a few times over the years – it’s easy to assume the sidereal year is a relic of our ancestors’ less accurate timekeeping abilities. You know: they were trying to measure a real year, but they got it wrong.
In fact, it’s kind of the opposite. The sidereal year isn’t a result of our ancestors aiming for the right thing, but screwing up the math – they actually did the difficult bits really well. They were just starting from a point of view that included “huge sky-ceiling covered in stars, definitely exists, include it in your calculations.”
And, as primitive as we like to think of our ancestors as being, they did actually know that the sidereal year wasn’t as long as the tropical equivalent. Hipparchus, in the second century BCE, put the discrepancy down to a slow rotation of the celestial sphere about a certain point in the sky – the so-called “poles of the ecliptic” – which he measured to be a speed of one degree per century.
He was wrong on that – “the actual value is about one degree every 72 years,” Linton pointed out, which would eventually wreak havoc in astronomy about a millennium later – but it’s worth pointing out that these ancient scientists got everything else pretty much bang on. And considering their starting point, that’s pretty impressive: after all, if the universe actually were a set of concentric physical spheres with us in the middle, figuring out the length of the sidereal year would be not only a natural extension of astronomy, but an important and technically complex calculation.
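Those two rates imply very different timescales for a complete 360-degree circuit of the sky. A quick comparison (an illustrative sketch, using only the rates quoted above):

```python
# Precession rates from the passage: Hipparchus estimated one degree
# per century; the modern value (per Linton) is about one degree
# every 72 years.
hipparchus_cycle_years = 360 * 100   # 36,000 years for a full circuit
modern_cycle_years = 360 * 72        # 25,920 years
print(hipparchus_cycle_years, modern_cycle_years)  # 36000 25920
```

The modern figure is the familiar roughly 26,000-year precession cycle, so Hipparchus overestimated its length, but not catastrophically so for a second-century-BCE measurement.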
Of course, it seems pointless to most of us, with our modern ideas like “gravity” and “potentially infinite expanse of space”, but their models were based on “extremely natural assumption[s],” Linton pointed out. “The evidence to the contrary is far from obvious.”
“The fact that the natural interpretation of the situation is wrong is one of the reasons why astronomy has such an absorbing history,” he wrote. “Progress in man’s understanding of the nature of the Universe has not been a gradual refinement of simple and intuitive ideas, but a struggle to replace the seemingly obvious by, what to many, was patently absurd.”