History of Reliance on timezone data
<p>Following an off-list post to someone on the tz list, a thought came to me about the idea of losing timezones. I started working with what was 'tikiwiki' back then and ported it to Firebird, which was still Interbase at the time, so some while ago now. Firebird only ever worked with a plain 'local' time internally, which for us was naturally UTC, and we have always simply used tz data to provide the correct time display to users ... despite the simple fact that browsers do not ACTUALLY provide a tz identifier, only the current time offset, so this could only work accurately for users who had already logged in and set their tz to provide daylight saving offsets.</p>
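<p>As a minimal sketch of that limitation (nothing to do with tikiwiki's actual code), all a browser script could reliably report was the offset in force at that instant, not a zone identifier or its DST rules:</p>
<pre><code>// Sketch only: what a browser script can report about the client clock.
// getTimezoneOffset() returns the difference in minutes between UTC and local
// time for the current instant, so it already includes any DST shift in force
// right now, but says nothing about the zone's rules for other dates.
const offsetMinutes = new Date().getTimezoneOffset();
// For London in summer this prints -60 (one hour ahead of UTC) - the same value
// as any other zone currently at UTC+1, so the zone itself stays unknowable.
console.log("Minutes between UTC and local time right now:", offsetMinutes);
</code></pre>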
<p>tikiwiki stored everything in 'user local time' and had some clever juggling to return what was purported to be UTC, but it was wrong more often than not, usually when looking at the calendar six months later. It then converted timestamps to the client user's local time ... with similar errors. So my first change for my own port was to switch to only using UTC, as that was natural for Firebird, and then only translate a user's local time to UTC. This still had the same problem: if the user was not logged in you had no DST data to work with, but you just blocked that sort of problem conversion until the user set a current tz. This is still the case even today. NOBODY seems to be bothered that the browser tz tag is simply unusable?</p>
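<p>The display side of that change is straightforward once a user has chosen a zone; a rough sketch using the standard Intl API, with my own function name rather than anything from tikiwiki:</p>
<pre><code>// Sketch only: timestamps are stored as UTC and converted purely for display.
function showToUser(utcMillis: number, userTz: string | null): string {
  if (userTz === null) {
    // No zone chosen yet: refuse to guess at DST and show plain UTC instead.
    return new Date(utcMillis).toISOString();
  }
  return new Intl.DateTimeFormat("en-GB", {
    timeZone: userTz,        // IANA identifier, e.g. "Europe/Berlin"
    dateStyle: "medium",
    timeStyle: "long",
  }).format(new Date(utcMillis));
}

console.log(showToUser(Date.UTC(2024, 6, 1, 12, 0), "Europe/Berlin")); // roughly "1 Jul 2024, 14:00:00 CEST"
console.log(showToUser(Date.UTC(2024, 6, 1, 12, 0), null));            // "2024-07-01T12:00:00.000Z"
</code></pre>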
<p>What I have realised is that it's the obsession with making the timezone an integral part of a timestamp which is where everything goes wrong, and that there is little point in doing that. We have avoided adding a timezone to the Firebird timestamp datatype, and the reasoning behind that is even more relevant today. The only timestamp value that is needed is the UTC-normalized 'number'; it is simply how that number is displayed which is of any interest. It's a discussion that has been had before, and yes, there is a need to store a timezone in the same record as the timestamp, but that is only to identify the location at which the date is relevant. This way one only ever needs the timezone data to display a date and time. The default is UTC where we have no identifying tz, but normally it will be the location tz or the user's tz, depending on what information is required to be displayed. For a conference call meeting, one needs to know the time of the meeting at the host location as well as your local time for that meeting, and if it's being set up across a DST change, both sites may well have different DST rules in addition to the simple time zone offset.</p>
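<p>Roughly what that record shape looks like, and how a meeting gets rendered for both ends of the call; the names here are my own sketch, not Firebird syntax:</p>
<pre><code>// Sketch: the timestamp itself is just a UTC-normalized number; the zone is a
// separate column that only identifies the location at which the event is relevant.
interface EventRecord {
  utcMillis: number;          // the only time value actually stored
  locationTz: string | null;  // IANA zone of the host location, if known
}

function describeMeeting(ev: EventRecord, viewerTz: string): string[] {
  const fmt = (tz: string) =>
    new Intl.DateTimeFormat("en-GB", {
      timeZone: tz, dateStyle: "medium", timeStyle: "long",
    }).format(new Date(ev.utcMillis));
  return [
    "Host time: " + fmt(ev.locationTz ?? "UTC"),  // fall back to UTC when the location tz is unknown
    "Your time: " + fmt(viewerTz),
  ];
}

// A call hosted in New York, viewed from London, set up just after both DST changes:
const call: EventRecord = { utcMillis: Date.UTC(2025, 10, 3, 15, 0), locationTz: "America/New_York" };
console.log(describeMeeting(call, "Europe/London").join("\n"));
// Host time: roughly "3 Nov 2025, 10:00:00 EST"
// Your time: roughly "3 Nov 2025, 15:00:00 GMT"
</code></pre>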
<p>There has been comment on the situation where you do not know the tz for a timestamp, and that simply means that the tz of the stored timestamp is 'Unknown' rather than UTC normalized. However it is the context that the timestamp relates to which will give an indication as to how it should be normalized, and the information relating to that identification may well be more than a simple tz identifier anyway. As a bare minimum we can use the longitude of the location to provide an offset, but it is the uncertainty over when that was simplified into the basic timezone segments which is one area of contention. In practice, time data recorded before the advent of more accurate timing devices would be based on measurements around midday, so that normalization is a best estimate, but with an uncertainty that may well be hours. For genealogical data, the level of accuracy is important, and when constructing time lines, normalization allows the timing of events to be accurately compared.</p>
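<p>The bare-minimum estimate is just arithmetic: 360 degrees of longitude map onto 24 hours, so each 15 degrees east of Greenwich is an hour ahead. A sketch, with made-up names and no claim about how the uncertainty should actually be recorded:</p>
<pre><code>// Sketch: a longitude-only estimate of the offset from UTC, for use when no
// tz identification exists for an old timestamp.
function longitudeOffsetHours(longitudeDegEast: number): number {
  return longitudeDegEast / 15;   // 15 degrees of longitude per hour
}

// Example: an event recorded at Rome (about 12.5 degrees E) in local solar time.
const offsetHours = longitudeOffsetHours(12.5);   // ~0.83 hours ahead of UTC
const solarNoonUtc = 12 - offsetHours;            // local solar noon was ~11:10 UTC
console.log(offsetHours.toFixed(2), solarNoonUtc.toFixed(2));
</code></pre>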
<p>While we have concentrated on normalizing time elements, of course where we are working with dates it may be that a normalized date is a day earlier or later than the recorded date. The accuracy of the data may well mean that there are 24 hours of uncertainty, at which point working with a midpoint of midday makes sense. The uncertainty window is around the event location, and it is that which gets normalized.</p>
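<p>A sketch of that midpoint approach for a date-only record, again with invented names; the stored value is the normalized midpoint plus the size of the uncertainty window around it:</p>
<pre><code>// Sketch: normalize a date-only record by taking midday at the event location
// (estimated from longitude) as the midpoint, with half a local day of
// uncertainty either side of it.
interface NormalizedDate {
  utcMillis: number;           // midpoint of the window, UTC-normalized
  uncertaintyMillis: number;   // +/- half a day for a bare date
}

function normalizeDateOnly(year: number, month: number, day: number,
                           longitudeDegEast: number): NormalizedDate {
  const offsetMillis = (longitudeDegEast / 15) * 3600_000;   // longitude -> hours -> ms
  const localNoonAsIfUtc = Date.UTC(year, month - 1, day, 12, 0, 0);
  return {
    utcMillis: localNoonAsIfUtc - offsetMillis,   // shift local midday back onto UTC
    uncertaintyMillis: 12 * 3600_000,             // the event lies somewhere in that local day
  };
}

// A bare date recorded at about 30 degrees E: the midpoint lands at 10:00 UTC, +/- 12 hours.
console.log(normalizeDateOnly(1843, 5, 20, 30));
</code></pre>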
<p>As a sideline, since it is astronomical data that is often used to help identify when events happened in the past, how astronomers record time should be a useful cross reference. It turns out that while the question of 'where' has been resolved, starting back in 2002 with a <a href="http://www.aanda.org/index.php?option=com_article&access=bibcode&Itemid=129&bibcode=2002A%2526A...395.1061GFUL">standard for World Coordinates</a>, the 'when' is currently still a standard under discussion. Of course the problem of 'where' is one which needs care, since while we can probably get away with longitude and latitude on a perfect sphere, the Earth's surface deviates as much from that as the length of a solar day varies month on month. 'Very Long Baseline Array' radio telescopes probably show up just about every problem of providing an accurate time. The physical location of each antenna has to be accurately defined so that the projection of the baseline onto the direction being targeted gives a time offset for each received signal. Add to this the time taken to transmit data across what may well be several time zones to a central processor, and one understands why the recording process needs to be able to support a level of time accuracy far beyond simple history. Certainly one appreciates better why 128-bit accuracy is almost essential in <a href="http://hea-www.cfa.harvard.edu/~arots/TimeWCS/WCSPaper-IV-v1.03A4.pdf">the draft specification</a>, but many of us are happy working to the nearest second ...</p>
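<p>For the curious, the delay that has to be removed for each antenna is, to first order, just the projection of the baseline onto the direction of the source divided by the speed of light. A toy version of that arithmetic, with nothing like the corrections a real correlator applies:</p>
<pre><code>// Toy sketch of the geometric delay between two antennas: project the baseline
// vector onto the unit vector pointing at the source and divide by the speed of
// light. Real correlators model far more than this.
const C = 299_792_458;   // metres per second

type Vec3 = [number, number, number];

function geometricDelaySeconds(baselineMetres: Vec3, sourceUnitVec: Vec3): number {
  const dot = baselineMetres[0] * sourceUnitVec[0]
            + baselineMetres[1] * sourceUnitVec[1]
            + baselineMetres[2] * sourceUnitVec[2];
  return dot / C;
}

// An 8000 km baseline with the source 30 degrees away from the baseline direction:
const delay = geometricDelaySeconds([8_000_000, 0, 0],
                                    [Math.cos(Math.PI / 6), Math.sin(Math.PI / 6), 0]);
console.log(delay);   // ~0.023 s, which has to be tracked to fractions of a nanosecond
</code></pre>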