Timezones; sure. But what about before timezones came into use? Or even halfway through - which timezone applies, considering Königsberg used CET when it was part of Germany but switched to EET after it became Russian? There are even countries whose timezone offsets differ by 15 minutes (Nepal is at UTC+5:45).
And don't get me started on daylight saving time. There's been at least one instance where DST was - and was not - in use in Lebanon at the same time! Good luck booking an appointment...
Not to mention the transition from the Julian calendar to the Gregorian, which took place over many, many years - handled differently by different countries - as defined by the country borders of the time...
We've even had countries that forgot to insert a leap day in certain years, causing March 1 to fall on different days in different places for a couple of years.
Time is a mess. It is, always has been, and always will be.
What they did instead was to "smear" it across the day, adding 1/86400 of a second to every second on 31st Dec. 1/86400 of a second is well within the margin of error for NTP, so computers could carry on doing what they do without throwing errors.
Edit: they smeared it from the noon before the leap second to the noon after, i.e. 31st Dec 12pm - 1st Jan 12pm.
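For illustration, here's a minimal sketch of that kind of linear smear in Python - not Google's production code, just the arithmetic described above (one leap second absorbed over a noon-to-noon window):

```python
SMEAR_WINDOW = 86_400   # smeared seconds displayed between the two noons
LEAP = 1                # one positive leap second to absorb

def smeared_elapsed(si_seconds_since_noon: float) -> float:
    """Wall-clock seconds shown by a smearing clock, given SI seconds
    elapsed since the noon at which the smear starts.

    The window actually contains 86_401 SI seconds, but the clock only
    displays 86_400 of them, so every displayed second is stretched by
    roughly 1/86400 -- well inside NTP's tolerance, as noted above.
    """
    if si_seconds_since_noon <= 0:
        return si_seconds_since_noon
    if si_seconds_since_noon >= SMEAR_WINDOW + LEAP:
        # After the window the clock ticks normally again, offset by the leap.
        return si_seconds_since_noon - LEAP
    return si_seconds_since_noon * SMEAR_WINDOW / (SMEAR_WINDOW + LEAP)

# Midway through the window the smeared clock is exactly half a second behind.
assert abs(smeared_elapsed(43_200.5) - 43_200.0) < 1e-6
```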
- system clock drift. Google's instances keep accurate time using atomic clocks in the datacenter, with leap seconds smeared over a day. For accurate duration measurements, this may matter.
- consider how the time information is consumed. For a photo sharing site, the best info to keep with each photo is a location and a local date-time. Then, even if some of this is missing, a New Year's Eve photo will still read as close to midnight without having to consider its timezone or location. I had this case and opted for string representations that wouldn't be automatically adjusted. Converting it to the viewer's local time isn't useful.
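For what it's worth, a minimal sketch of that "store the wall time as a string, don't convert" approach (the PhotoRecord type and field names are made up for illustration):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PhotoRecord:
    # Local wall-clock time exactly as the device reported it, never converted.
    local_datetime: str                # e.g. "2024-12-31 23:58:12"
    location: Optional[str] = None     # may be missing
    utc_offset: Optional[str] = None   # e.g. "+01:00", only if the device knew it

photo = PhotoRecord(local_datetime="2024-12-31 23:58:12", location="Lisbon, Portugal")

# Display just echoes the stored string: a New Year's Eve photo reads as
# "a couple of minutes before midnight" regardless of where the viewer is.
caption = f"Taken {photo.local_datetime}"
if photo.location:
    caption += f" in {photo.location}"
print(caption)
```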
Confusion about special and general relativity accounts for almost none of the problems that programmers encounter in practice. If that's your use case, then fine, time is special and tricky.
The most common issue is a failure to separate model vs. view concepts: timestamps are a model, but local time, day of the week, and leap seconds are all view concepts. The second most common issue is thinking that UTC is suitable as a model, instead of the much more reasonable TAI64. After that it's probably the difference between scheduling requests vs. logging what happened: "the meeting is scheduled next Wednesday at 9am local time" vs. "the meeting happened in this time span". One is a fact about the past; the other is just scheduling criteria. The criteria could also be something complicated like "every other Wednesday", or "every Wednesday on an even-numbered day of the month", or "we can work on this task once these 2 machines are available", "this process will run in 2 time slots from now", etc.
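A rough sketch of that model/view split (the types and names are mine, not the commenter's; the instant is stored as UTC here purely for brevity, where the comment would prefer TAI64):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Model: what actually happened -- an instant (a span's endpoints are instants too).
@dataclass
class MeetingRecord:
    started: datetime   # timezone-aware, stored in UTC
    ended: datetime

# Model: scheduling criteria -- not an instant at all.
@dataclass
class MeetingRule:
    weekday: int        # 0 = Monday ... 2 = Wednesday
    local_time: str     # "09:00"
    zone: str           # "America/New_York"

# View: local time and weekday are derived at display time, never stored.
def render(record: MeetingRecord, viewer_zone: str) -> str:
    local = record.started.astimezone(ZoneInfo(viewer_zone))
    return local.strftime("%A %Y-%m-%d %H:%M %Z")

record = MeetingRecord(
    started=datetime(2025, 6, 4, 13, 0, tzinfo=timezone.utc),
    ended=datetime(2025, 6, 4, 14, 0, tzinfo=timezone.utc),
)
print(render(record, "America/New_York"))   # "Wednesday 2025-06-04 09:00 EDT"
```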
I know you had to limit the length of the post, but time is an interest of mine, so here's a couple more points you may find interesting:
UTC is not an acronym. The story I heard was that the English acronym would be "CUT" (the name is "Coordinated Universal Time") and the French complained, the French acronym would be "TUC" and the English-speaking committee members complained, so they settled for something that wasn't pronounceable in either language. (FYI, "ISO" isn't an acronym either!)
Leap seconds caused such havoc (especially in data centers) that they are being phased out: the current plan is to stop inserting them by 2035. (What will happen after that is anyone's guess.) But for now, you can rest easy and ignore them.
I have a short list of time (and NTP) related links at <https://wpollock.com/Cts2322.htm#NTP>.
I see someone else is a Vernor Vinge fan.
But it's kind of a wild choice for an epoch, when you're very likely to be interfacing with systems whose Epoch starts approximately five months later.
But I hate how, when I stack my yearly weather charts, every four years either the graph is off by one day, so it is 1/366th narrower and the month delimiters don't line up perfectly, or I have to duplicate Feb 28th so there is no discontinuity in the lines. Still not sure how to represent that, but it sure bugs me.
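For what it's worth, a tiny sketch of the trade-off: normalizing each date to its fractional position within its own year makes every year span the same width, but the month boundaries still drift by about a day after February in leap years, which is exactly the mismatch described above.

```python
from datetime import date

def year_fraction(d: date) -> float:
    """Position of a date within its year, in [0, 1)."""
    start = date(d.year, 1, 1)
    days_in_year = (date(d.year + 1, 1, 1) - start).days   # 365 or 366
    return (d - start).days / days_in_year

print(year_fraction(date(2023, 3, 1)))   # 59/365 ~= 0.1616
print(year_fraction(date(2024, 3, 1)))   # 60/366 ~= 0.1639 -- the month line shifts a hair
```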
Well, how do we know what timezone "2026-06-19 07:00" is in - so that we can tell the time rules for that timezone have changed - if we do not store the timezone?
Additionally, how do we really "detect that the time rules for that timezone have changed"? We can stay informed, sure, but is there a way to automate this?
My guess is that, with our increasing dependence on digital systems, the edge cases where these rules aren't properly updated will cause increasing amounts of pain "for no good reason".
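One way to automate at least part of it, sketched under the assumption that you store each future event as wall time + IANA zone + the UTC instant computed when it was booked (which also answers the "do we store the timezone?" question above): after every tzdata update, recompute the instant from the wall time and zone and compare it with what you stored.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def rules_changed(wall_time: str, zone: str, stored_utc: datetime) -> bool:
    """True if the current tzdata maps this wall time to a different instant."""
    local = datetime.fromisoformat(wall_time).replace(tzinfo=ZoneInfo(zone))
    return local.astimezone(timezone.utc) != stored_utc

# Booked when the zone's rules said 07:00 local = 04:00 UTC; if a later tzdata
# release changes the offset, this flags the appointment for review.
stored = datetime(2026, 6, 19, 4, 0, tzinfo=timezone.utc)
if rules_changed("2026-06-19 07:00", "Europe/Kyiv", stored):
    print("timezone rules changed since this was scheduled")
```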
In Brazil we changed our DST rules fairly recently - it was around 2017/2018 - and it caused a lot of confusion. I was working on a system where these changes really mattered, so I was aware of the change ahead of time. But there are a lot of systems running without much human intervention, and they are mostly forgotten until someone notices a problem.
The standard name for a duration in physics is the "period", written with an uppercase T (lowercase t being a point in time), which curiously enough is the inverse of a frequency (f = 1/T). A period can also be thought of as the length of an interval [t0, t1], i.e. T = t1 - t0, whose points t satisfy t0 <= t <= t1.
> The concept of "absolute time" (or "physical/universal time") refers to these instants, which are unique and precisely represent moments in time, irrespective of concepts like calendars and timezones.
Funnily enough, you mean the opposite. An absolute time physically does not exist, just like an absolute distance: there is no kilometer 0. Every measurement is relative to another; in the case of time you might measure relative to the birth of (our Lord and saviour) Jesus Christ. You never have a time "irrespective" of something else, and if you think you do, you are probably referring to a period with an implicit origin. For example, if I say a length of 3m, I mean an object whose distance from one end to the other is 3m. And if I say 4 minutes of a song, I mean that the end is 4 minutes after the start, in the same way that a direction might be represented by a 2D vector [1,1] only because we are assuming a relationship to [0,0].
That said, it's clear that you have a lot of knowledge about calendars from practical software experience implementing time features in global products. I'm just describing time from the completely different framework of classical physics, which is of course of little use when trying to figure out whether 6 PM in Buenos Aires and 1 PM six months from now in California will be the same time.
TAI provides a time coordinate generated by taking the weighted average of the proper times along some 450 world lines, each tracked by an atomic clock. Like any other time coordinate, it provides a temporal orientation, but no time coordinate can be described as "universal" or "linear" in general relativity. It is, however, a good approximation to the proper time experienced by most terrestrial observers.
Note that general relativity doesn't add much over special relativity here (the different atomic clocks have different velocities and sit at different gravitational potentials due to altitude, and so accumulate slightly different proper times along their world lines). If you already have a sufficiently general notion of spacetime coordinates, the curvature that general relativity adds over Minkowski space is simply one more effect changing the relation between coordinate time and proper time.
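For concreteness, both effects enter through the standard weak-field relation between a clock's proper time τ and coordinate time t (a textbook approximation, not something from this thread):

```latex
% Phi is the (negative) gravitational potential at the clock, v its velocity:
\frac{d\tau}{dt} \approx 1 + \frac{\Phi}{c^{2}} - \frac{v^{2}}{2c^{2}}
```

Higher altitude (less negative Φ) makes a clock run faster; higher speed makes it run slower. This is why the clocks contributing to TAI are corrected to a common reference surface, the geoid.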
This article explains it really well. The part about leap seconds especially got me. We literally have to smear time to keep servers from crashing. That’s kind of insane.
https://gist.github.com/timvisee/fcda9bbdff88d45cc9061606b4b...
In a nutshell: if you believe anything about time, you're wrong; there is always an exception, and an exception to the exception. And then Doc Brown runs you over with the DeLorean.
Instead I mostly use time for durations and for happens-before relationships. I still use Unix-flavor timestamps, but when I can I ensure monotonicity (to guard against backward jumps), and I never trust timestamps from untrusted sources (usually another node on the network). It often makes more sense to record the time a message was received than to trust the sender's clock.
That said, I am fortunate not to have to deal with complicated happens-before relationships in distributed computing. I recall reading the Spanner paper for the first time and being amazed at how they handled time windows.
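A minimal sketch of the "ensure monotonicity" idea mentioned above - wrap the wall clock so reported timestamps never go backwards even if the system clock is stepped back by NTP (the class name is mine):

```python
import threading
import time

class MonotonicTimestamper:
    """Unix-flavor timestamps clamped so they never decrease."""

    def __init__(self) -> None:
        self._last = 0.0
        self._lock = threading.Lock()

    def now(self) -> float:
        with self._lock:
            t = time.time()
            if t < self._last:
                t = self._last        # wall clock jumped backwards; hold the line
            self._last = t
            return t

clock = MonotonicTimestamper()
received_at = clock.now()   # record receive time locally rather than trusting the sender
```

(For pure durations, time.monotonic() avoids the wall clock entirely.)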
Timestamps are used to track things like file creation, transaction processing, and other digital events.
As computers and networks have become increasingly fast, the accuracy of the timestamps becomes more and more critical.
While the average human doesn't care whether a file was created at a time measured down to the nanosecond, it is often important to know whether it was created before or after the last backup snapshot.
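A tiny sketch of that before-or-after check, using nanosecond mtimes so closely spaced events still order correctly (the snapshot-marker-file convention and the example paths are made up):

```python
import os

def needs_backup(path: str, snapshot_marker: str) -> bool:
    """True if `path` was modified after the snapshot marker file was written."""
    return os.stat(path).st_mtime_ns > os.stat(snapshot_marker).st_mtime_ns

# e.g. needs_backup("/data/report.csv", "/backups/latest/.timestamp")
```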
I'm bookmarking this article to hand out to new developers.
"Wanna grab lunch at 1,748,718,000 seconds from the Unix epoch?"
I'm totally going to start doing that now.
Ooh, this is a really interesting topic!
Okay, so the first thing to keep in mind is that there are three very important cyclical processes that play a fundamental role in human timekeeping and have done so since well before anything we could detect archaeologically: the daily solar cycle, the lunar cycle (whence the month), and the solar year. All of these are measurable with mark 1 human eyeballs and nothing more technologically advanced than a marking stick.
For most of human history, the fundamental unit of time from which all other units were defined was the day. Even in the SI system, the second wasn't redefined in terms of anything more fundamental than the Earth's kinematics until about 60 years ago (the caesium definition dates to 1967). In several cultures, the daylight and nighttime hours were each subdivided into a fixed number of periods, which means the length of the local equivalent of an 'hour' varied with the day of the year.
Now, calendars specifically refer to systems for counting multiple days, and they break down into three main categories: lunar calendars, which look only at the lunar cycle and don't care about aligning with the solar year; lunisolar calendars, which insert leap months to keep the lunar cycle vaguely aligned with the solar year (since a year is about 12.4 lunations long); and solar calendars, which don't try to align with the lunations (although you usually still end up with subdivisions of roughly the length of a lunation). Most calendars are actually lunisolar, probably because lunations are relatively easy to calibrate (when you can go outside and see the first hint of a new moon, you start the new month) but one of the purposes of a calendar is also to keep track of seasons for planting, so some degree of solar alignment is necessary.
If you're following the history of the Western calendrical tradition, the antecedent of the Gregorian calendar is the Julian calendar, promulgated by Julius Caesar as an adaptation of the Egyptian solar calendar for the Romans, after a series of civil wars caused officials to neglect the addition of the requisite leap months. In a hilarious historical example of a fencepost error, the counting of years between leap years was botched, and his successor Augustus had to fix the calendar so that a leap year came every fourth year instead of every third - small details. I should also point out that, while the Julian calendar found wide purchase in Christendom, that didn't mean it was handled consistently: the day the year started varied from country to country, with some countries preferring Christmas as New Year's Day and others preferring dates as late as Easter itself, which isn't even a fixed day every year. The standardization of January 1 as New Year's Day isn't really universal until countries start adopting the Gregorian calendar (and the transition between the Julian and Gregorian calendars is not smooth at all).
Counting years is even more diverse and, quite frankly, annoying. The most common year-numbering scheme is regnal numbering: it's the 10th year of King Such-and-Such's reign. Putting together an absolute chronology in that situation requires accurate lists of kings and the like that are often lacking; there are essentially perennial conflicts in Ancient Near East studies over how to map those dates to ones we'd be more comfortable with. If you think that's too orderly, you could just name years after significant events (this is essentially how Winter Counts work in Native American cultures); the Roman consular system worked on that basis. If you're lucky, people sometimes also had an absolute, epoch-based year number - the way modern people largely agree that it's the year 2025, or the Romans' 'AUC', dating from the mythical founding of Rome - but this tends not to have been the dominant mode of year numbering for most of recorded human history.