I caught myself spending more than a minute googling for a definition of "epoch" while trying to explain an epoch-based timestamp field to a colleague. For future reference:
epoch n. [Unix: prob. from astronomical timekeeping] The time and date corresponding to 0 in an operating system's clock and timestamp values. Under most Unix versions the epoch is 00:00:00 GMT, January 1, 1970; under VMS, it's 00:00:00 of November 17, 1858 (base date of the U.S. Naval Observatory's ephemerides); on a Macintosh, it's the midnight beginning January 1 1904. System time is measured in seconds or ticks past the epoch. Weird problems may ensue when the clock wraps around (see wrap around), which is not necessarily a rare event; on systems counting 10 ticks per second, a signed 32-bit count of ticks is good only for 6.8 years. The 1-tick-per-second clock of Unix is good only until January 18, 2038, assuming at least some software continues to consider it signed and that word lengths don't increase by then. See also wall time. Microsoft Windows, on the other hand, has an epoch problem every 49.7 days - but this is seldom noticed as Windows is almost incapable of staying up continuously for that long.
[Source: The Jargon File: epoch]

In other words, an epoch-based timestamp makes it easy to calculate time differences (take timestamp A minus timestamp B and voila...), and is historically widely(?) used in the UN*X world.
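The "timestamp A minus timestamp B" arithmetic, and the wraparound figures the Jargon File mentions, can be sketched in a few lines of Python (the dates used here are just illustrative):

```python
from datetime import datetime, timezone, timedelta

# Epoch arithmetic: a timestamp is just a count of seconds past the
# epoch, so a time difference is plain subtraction.
a = datetime(2003, 2, 28, 13, 49, tzinfo=timezone.utc).timestamp()
b = datetime(2003, 2, 27, 13, 49, tzinfo=timezone.utc).timestamp()
print(a - b)  # 86400.0 -- exactly one day, in seconds

# The signed 32-bit limit behind the "2038" warning: the largest
# representable count is 2**31 - 1 seconds past the Unix epoch.
limit = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=2**31 - 1)
print(limit)  # 2038-01-19 03:14:07+00:00

# The Windows "49.7 days" figure: an unsigned 32-bit millisecond
# counter wraps after 2**32 ms.
print(2**32 / 1000 / 86400)  # ~49.71 days
```

Note how the one-day difference falls straight out of the subtraction, with no calendar logic at all, which is exactly the convenience the quote describes.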
(Side note: I wonder when someone will turn the Jargon File into a Hacker/Jargon-Wiki...)
More fun stuff:
They are not all UTC. In particular, I believe the Mac OS (Classic) epoch is in local time. Yay.
- ask
That's just plain surprising -- a time-zone dependent epoch sounds like .... uh.... a contradiction in terms? :-)
Posted by: Anders Jacobsen on February 28, 2003 01:49 PM
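The awkwardness of a local-time epoch can be made concrete with a small sketch. The 2,082,844,800-second constant is the well-known offset between the classic Mac epoch (1904-01-01) and the Unix epoch (1970-01-01); the helper function and its parameters are hypothetical, purely for illustration:

```python
MAC_EPOCH_OFFSET = 2_082_844_800  # seconds from 1904-01-01 to 1970-01-01

def mac_to_unix(mac_seconds, utc_offset_hours):
    """Convert a classic Mac OS timestamp (seconds since 1904-01-01,
    *local* time) to a Unix timestamp (seconds since 1970-01-01 UTC).

    Because the Mac epoch is in local time, the machine's UTC offset
    must be supplied out of band -- which is exactly the contradiction
    in terms discussed above: the same stored number means different
    absolute times on machines in different time zones.
    """
    return mac_seconds - MAC_EPOCH_OFFSET - utc_offset_hours * 3600

# A Mac clock reading taken at 1970-01-01 00:00:00 local time,
# on a machine one hour east of Greenwich (UTC+1):
print(mac_to_unix(2_082_844_800, 1))  # -3600, i.e. 1969-12-31 23:00:00 UTC
```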