epoch

By TechTarget Contributor

What is an epoch?

In a computing context, an epoch is the date and time relative to which a computer's clock and timestamp values are determined. The epoch traditionally corresponds to 0 hours, 0 minutes and 0 seconds (00:00:00) Coordinated Universal Time (UTC) on a specific date, which varies from system to system. Most versions of Unix, for example, use January 1, 1970 as the epoch date; Windows uses January 1, 1601; classic Mac OS used January 1, 1904; and Digital Equipment Corporation's Virtual Memory System (VMS) uses November 17, 1858.
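To make this concrete, here is a minimal Python sketch; Python's standard library, like most Unix-derived systems, counts seconds from the January 1, 1970 epoch:

```python
import time
from datetime import datetime, timezone

# Timestamp 0 is the epoch itself: 00:00:00 UTC on January 1, 1970.
print(datetime.fromtimestamp(0, tz=timezone.utc))
# -> 1970-01-01 00:00:00+00:00

# The current time is simply the number of seconds elapsed since that moment.
print(time.time())  # e.g. 1713800000.0
```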

The date and time in a computer are determined by the number of seconds or clock ticks that have elapsed since the defined epoch for that computer or platform. The range of times that can be represented is limited by the counter's word length and by the number of ticks per second. On a system that stores the count as a signed 32-bit integer with one tick per second, for example, the clock wraps around (exceeds its maximum representable value) on January 19, 2038, an issue known as the Year 2038 problem. Many platforms have headed this off by moving to 64-bit time values, which push the wrap-around date far beyond any practical concern. On a machine that counts 16 ticks per second, however, a 32-bit counter wraps around in just over four years, well within the useful lifetime of the machine.
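The wrap-around dates follow directly from the arithmetic. Here is a short Python sketch, assuming a signed 32-bit tick counter measured from the Unix epoch:

```python
from datetime import datetime, timedelta, timezone

UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# A signed 32-bit counter holds at most 2**31 - 1 ticks.
MAX_TICKS = 2**31 - 1

# At one tick per second, the counter overflows early in 2038.
print(UNIX_EPOCH + timedelta(seconds=MAX_TICKS))
# -> 2038-01-19 03:14:07+00:00

# At 16 ticks per second, the same counter wraps after about 4.25 years.
print(UNIX_EPOCH + timedelta(seconds=MAX_TICKS / 16))
# -> 1974-04-03 10:42:07.937500+00:00
```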

Other ways "epoch" is used

In geology, an epoch is a subdivision of a period, which is in turn a subdivision of an era; epochs vary in length. We are currently in the Holocene epoch of the Quaternary period, which is part of the Cenozoic era.

In tide prediction, an epoch is a period of 19 years, representing one complete cycle of all possible alignments of the sun and the moon.

In astronomy, an epoch is the point in time at which a calendar, or a defined time frame within a calendar, is considered to begin. In 1984, the International Astronomical Union adopted epoch J2000.0 as the standard reference epoch; it corresponds to 12:00 Terrestrial Time, approximately noon UTC, on January 1, 2000.

23 Apr 2024
