A brief history of time systems

Time is an illusion. Lunchtime doubly so. -- Douglas Adams

(Woman to Yogi Berra): What time is it? (Berra): You mean right now?

Everything becomes subjective and observer-oriented when you study relativity. Or Buddhism. Or get drafted. -- Joe Haldeman

This is highly abridged, and is intended just to explain the time systems used on this site. Good explanations of several other time scales can be found elsewhere.

• UTC (Coordinated Universal Time; the abbreviation is a language-neutral compromise between the English and French word orders). This is "official, standard time"; the time on a clock or computer is usually equal to this, adjusted by some number of hours to get your local time. It's sometimes called GMT (Greenwich Mean Time), but that's not quite right. GMT is an older system, obsolete for decades now, which morphed into UTC.

UTC is based on the average of atomic clocks all around the world. It's also required to stay within 0.9 seconds of mean solar time (MST, described below) at Greenwich. That requirement is a problem, since MST depends on the earth's not entirely regular rotation: time measured by MST will gain or lose a second every now and then relative to time measured by atomic clocks.

To get around this, leap seconds are sometimes inserted at the end of the last day of December or June; instead of having the usual 86400 seconds, those days have 86401 seconds. That keeps UTC roughly in sync with the earth's rotation. The International Earth Rotation and Reference Systems Service (IERS) keeps track of the earth's actual rotation and determines when a leap second needs to be inserted. (In theory, they might even need to delete a second if the earth sped up, so that a day at the end of June or December would have only 86399 seconds. That doesn't appear likely to happen, though.)
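
As a minimal sketch of the above, the following C function answers "how many seconds are in this UTC day?" The two-entry leap-second list is a hypothetical excerpt for illustration only (the function and table names are mine); real code would read the full, current table published by the IERS.

    /* Hypothetical two-entry excerpt of the leap-second table; the
       real, complete list is published by the IERS. */
    struct leap_date { int year, month; };
    static const struct leap_date leaps[] = { { 2015, 6 }, { 2016, 12 } };

    /* Number of seconds in the given UTC day.  Leap seconds go at the
       end of June (day 30) or December (day 31). */
    int utc_day_length(const int year, const int month, const int day)
    {
        const int n_leaps = (int)(sizeof(leaps) / sizeof(leaps[0]));
        const int last_day = (month == 6 ? 30 : 31);
        int i;

        for (i = 0; i < n_leaps; i++)
            if (leaps[i].year == year && leaps[i].month == month
                            && day == last_day)
                return 86401;   /* 86399 for a (so far unused) deleted second */
        return 86400;
    }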

Unfortunately, the earth's rotation varies unpredictably due to climate and geology (earthquakes, for example). So you can't really know far in advance when a leap second will be inserted; UTC is well-defined for past years and for the next six months or so, but becomes harder and harder to pin down the further you look into the future. UTC is also annoying in that you can have a time such as 23:59:60.812 UTC. Astronomers tend to do most of their math using TT (described below), converting to UTC when it's time to show data in a form suitable for comparing to a clock on a wall.

• TT (Terrestrial Time) is the average time as measured by atomic clocks on the surface of the earth. It runs at the same rate as UTC, but "slips" relative to UTC by a second when a leap second is inserted. TT makes for a logical system for astronomers. With UTC, 23:59:00 on one day and 00:01:00 on the next may be two minutes apart, or they may be 121 seconds apart. TT doesn't have that problem.

In early 2018, TT - UTC = 69.1840 seconds, exactly. (I.e., if it's midnight UTC, it's 00:01:09.1840 TT.) They've been that far apart since the last leap second was inserted, at the end of 2016. From 2015 July 1 to 2017 Jan 1, TT - UTC was exactly 68.1840 seconds. At some point (probably at the end of 2019), IERS will have to announce another leap second. When that is inserted, we'll have TT - UTC = 70.1840 seconds, again exactly.
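
Since TT - UTC is a step function that changes only when a leap second goes in, it amounts to a table lookup. A sketch in C, using only the two offsets quoted above (the input is a Julian Day, the day count described further down; a real implementation would carry the whole leap-second history):

    /* TT - UTC, in seconds, as a function of the UTC Julian Day.
       Only the two steps quoted above are included. */
    double tt_minus_utc(const double jd_utc)
    {
        if (jd_utc >= 2457754.5)        /* 2017 Jan 1, just after the leap */
            return 69.184;              /* second at the end of 2016       */
        else if (jd_utc >= 2457204.5)   /* 2015 Jul 1 */
            return 68.184;
        else                            /* earlier dates would need more */
            return 67.184;              /* table entries than shown here */
    }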

• TDB (Barycentric Dynamical Time) is the time scale used for fundamental solar system ephemerides. In theory, at least, it should be the time a clock would measure if it were well outside the solar system and at rest relative to it, with a slight scaling factor to make it run at the same rate (on average) as clocks on the surface of the earth.

The idea here is that if you put a clock on a rocket, went far outside the solar system, slowed down to rest relative to the center of mass of the solar system (the barycenter), and compared the speed of your clock to those back on earth (the ones still measuring Terrestrial Time), you'd notice two differences. First, your clock would be running faster, by about 1.5 millionths of one percent, because you'd be outside the gravitational effects of the earth, sun, and everything else in the solar system.

Second, you'd notice some periodic differences, the largest being about 1.7 milliseconds as the earth went around the sun, getting closer to and further from it over the course of a year. (Plus smaller differences as it got closer to and further from the moon and planets.) TDB is what you'd get if you adjusted your clock to stay, on average, in sync with TT, ignoring those periodic differences.

1.7 milliseconds may not seem like much, and this difference actually is ignored in many astronomical computations. For those that require serious precision, though (radar observations, VLBI, etc.), it becomes quite important.
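
The dominant part of that annual wobble can be approximated with a single sine term. A sketch in C of a common low-order approximation, good to a few tens of microseconds (a serious implementation would add many smaller periodic terms):

    #include <math.h>

    /* Approximate TDB - TT, in seconds:  about 1.7 ms times the sine
       of g, the earth's mean anomaly.  Input is a Julian Day on the
       TT scale (TDB works equally well at this level of accuracy). */
    double tdb_minus_tt(const double jd_tt)
    {
        const double pi = 3.14159265358979323846;
        const double g = (357.53 + 0.9856003 * (jd_tt - 2451545.0))
                            * pi / 180.;

        return 0.001657 * sin(g);
    }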

• Mean Solar Time (MST) is the time scale you get if you set a clock such that the sun is, on average, due south at noon. I say "on average" because over the course of a year, the sun will sometimes appear to run fast or slow by up to about 16 minutes. Note also that MST depends on where you are in the world: move one degree of longitude east, and the MST you'd measure moves ahead by four minutes (1/360 of a day).
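
That last bit of arithmetic is simple enough to show directly; a one-line sketch in C (the function name is mine, with east longitudes taken as positive):

    /* Offset of local mean solar time from Greenwich mean solar time:
       1440 minutes per 360 degrees = four minutes per degree. */
    double mst_offset_minutes(const double east_longitude_deg)
    {
        return 4. * east_longitude_deg;
    }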

• The Julian Day (JD) is a simple count of days relative to noon on -4712 January 1 (i.e., 4713 BC; astronomical year numbering includes a year zero). That's sufficiently long ago that the number is almost always positive, even when discussing truly ancient events. Current JD values are around 2.46 million.
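
Converting a calendar date to a JD is a standard exercise; the sketch below uses the widely published integer algorithm of Fliegel & Van Flandern (1968), valid for dates on the Gregorian calendar.

    /* JD at noon of a given Gregorian calendar date (Fliegel & Van
       Flandern 1968).  Integer division is intentional throughout. */
    long jd_at_noon(const int year, const int month, const int day)
    {
        const long a = (14L - month) / 12L;
        const long y = year + 4800L - a;
        const long m = month + 12L * a - 3L;

        return day + (153L * m + 2L) / 5L + 365L * y
                   + y / 4L - y / 100L + y / 400L - 32045L;
    }

    /* e.g., jd_at_noon(2018, 2, 17) returns 2458167, matching the
       example given below (2018 Feb 17 03:00 = JD 2458166.625). */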

• The Modified Julian Day (MJD) is a similar scheme, except the starting point is midnight (not noon) on 1858 Nov 17: 1858 Nov 17 0:00 = JD 2400000.5 = MJD 0.0. This fixes two problems with the JD scheme. First, JD "rolls over" to a new day at noon. For astronomers in Europe, this was actually a feature, not a bug: it meant that all observations made on a given night would take place on the same (integer) JD. For the rest of us, though, it's not helpful.

The second "problem" it fixes is that it means that, for dates since 1857 Nov 17, the date count is usually a five-digit number plus fraction, rather than a seven-digit one plus fraction. E.g., 2018 Feb 17 03:00 = JD 2458166.625 = MJD 58166.125.