ISO 8601

Date/time separation, further component separation inside those, different radices for those, arbitrary units, calendars, leap seconds and days, time zones, and finally awkward time formats – all of those are arbitrary to various extents (some are loosely based on observed cycles), and lead to unnecessary complexity and bugs in software, occasionally inspiring programmers to get creative and make it worse with new encodings.

ISO 8601 was supposed to at least alleviate the time format issues, but perceived user preferences tend to triumph over sanity, so at best it gets used internally in systems.

A way to share approximate timestamps between programs, even if not a sane way to work with those, would still be useful, but what we've got instead is incompatible ISO 8601 implementations. For instance, the paywalled text of the standard says that either a dot or a comma can be used to separate the fractional part; of course some programs print and/or parse only the dot, others – only the comma, yet others – whatever they feel like at the moment, depending on locale or phase of the moon. Some are rather liberal with time zone designators even where local time is allowed, and some won't parse what they themselves print. Those aren't amateur projects, and they tend to stay the same after bug reports – for the usual WONTFIX reasons such as backwards compatibility, it not being important or an issue at all, possibly redirecting to related projects where the report dies, etc.
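To illustrate the dot/comma mismatch, here is a minimal sketch (in Python, purely as an example – `parse_lenient` is a hypothetical helper, not anything standard) of what the accepting side often ends up doing: normalize the comma to a dot before handing the string to a stricter parser, since the standard allows both separators.

```python
from datetime import datetime

def parse_lenient(ts: str) -> datetime:
    """Parse a timestamp that may use either ',' or '.' as the
    fractional-second separator (both are valid per ISO 8601),
    by normalizing the comma to a dot first."""
    return datetime.fromisoformat(ts.replace(",", "."))

# The same instant, spelled both ways, parses to the same value.
a = parse_lenient("2024-05-01T12:30:45.250+00:00")
b = parse_lenient("2024-05-01T12:30:45,250+00:00")
assert a == b
```

This is a workaround, not a fix: it silently accepts strings the producer may consider invalid, which is exactly the kind of liberality that keeps the incompatibilities alive.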

And then there are profiles, particularly RFC 3339: it's supposed to resolve the ambiguities and generally be more useful for interoperability (and it succeeds at that), but it is quite restrictive, which may be a problem if you need only dates, or higher precision.

The issue is not specific to this standard – people implement standards sloppily all the time – yet the ambiguity in this one doesn't help. Though it's still better than no standard at all: at least most of the time it's possible to guess what's wrong and find a workaround, if it's known that the given or accepted timestamps are in someone's image of ISO 8601.

Yet another issue, also common among formats, is that it's not easy to parse without a proper parser. One should use a proper parser, but for most that's arcane art, for some it's an unnecessary and/or undesired dependency, and it seems like most of the time parsing is done with regular expressions, scanf(3), and various string manipulation and conversion functions. I used to think that sticking to regular grammars might help to ensure correct implementations, but apparently that wouldn't help much either, while a strict and as-dumb-as-possible format probably would; there's inherent complexity in time components, of course, but ISO 8601 introduces alternatives that could have been avoided.
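For contrast, a format with exactly one accepted shape really can be handled with a dumb regular expression; below is a sketch (Python; `parse_strict` and the chosen shape are my own illustration, not any standard profile) that accepts a single RFC 3339-like form and rejects everything else, including the comma separator and missing offsets:

```python
import re
from datetime import datetime, timedelta, timezone

# One fixed shape: YYYY-MM-DDTHH:MM:SS[.ffffff](Z|+HH:MM|-HH:MM)
TS_RE = re.compile(
    r"^(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})"
    r"(?:\.(\d{1,6}))?"          # dot only, no comma
    r"(Z|[+-]\d{2}:\d{2})$"      # offset is mandatory
)

def parse_strict(ts: str) -> datetime:
    m = TS_RE.match(ts)
    if m is None:
        raise ValueError(f"not in the accepted profile: {ts!r}")
    y, mo, d, h, mi, s, frac, off = m.groups()
    micro = int((frac or "0").ljust(6, "0"))  # ".25" -> 250000 us
    if off == "Z":
        tz = timezone.utc
    else:
        sign = 1 if off[0] == "+" else -1
        tz = timezone(sign * timedelta(hours=int(off[1:3]),
                                       minutes=int(off[4:6])))
    return datetime(int(y), int(mo), int(d),
                    int(h), int(mi), int(s), micro, tz)
```

The point is not that this parser is good (it still trusts the calendar logic to `datetime`), but that a strict grammar makes the correct-looking naive implementation and the actually correct one coincide far more often.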

A nice way to deal with time would probably involve natural units (e.g., Planck time) and a better model (relativistic), probably a logarithmic scale and base 2 or 3. But using something like that now would just introduce a new, particularly weird format, with complex conversion rules, and with implementations dependent on common units anyway. Perhaps the closest commonly used thing to it is Unix time, but opting even for that would likely introduce unnecessary incompatibility issues, and/or require frequent conversions, since the usual time components are embedded into cultures – which don't seem like they should affect technologies this deeply, but they do.
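The Unix time trade-off fits in a few lines (sketched in Python; the value 1700000000 is an arbitrary example): the integer is trivial to exchange between programs, but displaying it still drags in all the usual calendar machinery, so the conversions never go away.

```python
from datetime import datetime, timezone

# Unix time is trivially exchanged as a single integer...
t = 1700000000  # arbitrary example value

# ...but showing it to a human means converting back into the
# familiar calendar components, time zones and all.
dt = datetime.fromtimestamp(t, tz=timezone.utc)

# The round trip is lossless for whole seconds.
assert int(dt.timestamp()) == t
```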

It would have helped if projects didn't reimplement it on their own, but used some common library with proper ISO 8601 printing and parsing (and perhaps other time-related functions); but as usual, that's not likely to happen: native implementations are usually preferred, sometimes even ones "native" to the program in question.

I keep writing and removing a rant like this every few years (sometimes ranting about other units of measurement as well), since it's not a useful note, but perhaps I should finally leave it here. At least it's not likely to become outdated in the observable future.