Once upon a time

Human beings have probably been recording dates since we began writing, about five thousand years ago. It was not until 1582 that Pope Gregory XIII proposed (actually, decreed) a standard calendar. Global adoption took a few centuries but even now we still write dates differently around the world.

We humans can usually infer from culture and context whether, for example, 9/11 refers to events in New York on 11 September 2001 or in Berlin on 9 November 1989.

Computers are not so smart. For a computer to determine what we mean by 9/11, it needs further qualification. Consequently, we have to be very careful to enter dates in a format the system can understand, and even more careful when we transfer that data to another system, if we are to avoid ambiguity.
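A minimal sketch of this ambiguity, using Python's standard library: the same string parses to two different dates depending on which regional convention the parser is told to assume.

```python
from datetime import datetime

# The same text, parsed under two different regional conventions.
text = "9/11/2001"

us = datetime.strptime(text, "%m/%d/%Y")  # month first (US convention)
eu = datetime.strptime(text, "%d/%m/%Y")  # day first (European convention)

print(us.date())  # 2001-09-11
print(eu.date())  # 2001-11-09
```

Without the format string, the computer has no way to choose between the two readings.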

It does not have to be that way. There is a straightforward and elegant international standard for date and time representations: ISO 8601. It is widely used by computers around the world and by human beings in Norway.

ISO 8601 is a big-endian format. Simply stated, it puts the largest unit first and works down to the smallest: year, month, day, then (optionally) hour, minute and second. A date is written YYYY-MM-DD, and a date with a time is written YYYY-MM-DDTHH:MM:SS.
The beauty of the big-endian format is that dates sort sensibly and that the same format can be used with varying degrees of accuracy from the very specific 2009-03-26T15:54:58 to the more general 2009-03.
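A quick sketch of both properties, using Python's built-in sort: because the largest unit comes first, a plain text sort of ISO 8601 strings is also a chronological sort, even when the strings are written at different precisions.

```python
# ISO 8601 dates at varying precision, deliberately out of order.
dates = ["2009-03-26T15:54:58", "2009-03", "1989-11-09", "2001-09-11"]

# A plain lexicographic sort yields chronological order.
print(sorted(dates))
# ['1989-11-09', '2001-09-11', '2009-03', '2009-03-26T15:54:58']
```

No date-aware library is needed; the format itself does the work.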

This sort of knowledge was once arcane and relevant only to software engineers. Today it is just the sort of consideration that should be as familiar to everyone as our use of the Gregorian Calendar itself.

Recently, the world changed. Sometime in the mid-1990s we passed a tipping point, when a perfect storm of price, performance and connectivity in information technology shifted the majority of our communication from analogue to digital media.

In the analogue era we could record information in any sensible format and place the burden of interpreting it on subsequent readers. That is a poor strategy if we want our data to be fully exploitable in the digital era. Instead, we need to reevaluate our received ideas about presenting information, strip away a few layers and see the data. Then we can reap the full benefit of what we have to say being Machine Readable.