Coordinated Universal Time

Current time zones

Coordinated Universal Time (UTC) is the primary time standard globally used to regulate clocks and time. It establishes a reference for the current time, forming the basis for civil time and time zones. UTC facilitates international communication, navigation, scientific research, and commerce.

UTC has been widely embraced by most countries and is the effective successor to Greenwich Mean Time (GMT) in everyday usage and common applications.[a] In specialized domains such as scientific research, navigation, and timekeeping, other standards such as UT1 and International Atomic Time (TAI) are also used alongside UTC.

UTC is based on TAI, which is a weighted average of hundreds of atomic clocks worldwide. UTC is within about one second of mean solar time at 0° longitude, the currently used prime meridian, and is not adjusted for daylight saving time.

The coordination of time and frequency transmissions around the world began on 1 January 1960. UTC was first officially adopted as a standard in 1963 and "UTC" became the official abbreviation of Coordinated Universal Time in 1967.[2] The current version of UTC is defined by the International Telecommunication Union.

Since its adoption, UTC has been adjusted several times, notably with the introduction of leap seconds in 1972. In recent years, discussion has focused on eliminating leap seconds, which occasionally disrupt timekeeping systems worldwide. In 2022, the General Conference on Weights and Measures adopted a resolution to alter UTC with a new system that would eliminate leap seconds by 2035.[3]

Etymology

The official abbreviation for Coordinated Universal Time is UTC. This abbreviation comes as a result of the International Telecommunication Union and the International Astronomical Union wanting to use the same abbreviation in all languages.[4] The compromise that emerged was UTC,[5] which conforms to the pattern for the abbreviations of the variants of Universal Time (UT0, UT1, UT2, UT1R, etc.).[6]

McCarthy described the origin of the abbreviation:

In 1967 the CCIR adopted the names Coordinated Universal Time and Temps Universel Coordonné for the English and French names with the acronym UTC to be used in both languages. The name "Coordinated Universal Time (UTC)" was approved by a resolution of IAU Commissions 4 and 31 at the 13th General Assembly in 1967 (Trans. IAU, 1968).[2]

Uses

Time zones around the world are expressed using positive, zero, or negative offsets from UTC, as in the list of time zones by UTC offset.

The westernmost time zone uses UTC−12, being twelve hours behind UTC; the easternmost time zone uses UTC+14, being fourteen hours ahead of UTC. In 1995, the island nation of Kiribati moved those of its atolls in the Line Islands from UTC−10 to UTC+14 so that the whole country would be on the same day.

UTC is used in many Internet and World Wide Web standards. The Network Time Protocol (NTP), designed to synchronise the clocks of computers over the Internet, transmits time information from the UTC system.[7] If only millisecond precision is needed, clients can obtain the current UTC from a number of official internet UTC servers. For sub-microsecond precision, clients can obtain the time from satellite signals.
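
As a concrete illustration of the first case, the short sketch below queries a public NTP server and prints the result as UTC. It is only a sketch: it assumes the third-party ntplib package and the pool.ntp.org server pool, neither of which is named above, and its accuracy is limited to the millisecond level typical of NTP over the public Internet.

    # Sketch: obtain approximate UTC over NTP.
    # Assumes the third-party "ntplib" package and the public pool.ntp.org servers.
    from datetime import datetime, timezone

    import ntplib

    client = ntplib.NTPClient()
    response = client.request("pool.ntp.org", version=3)

    # tx_time is the server's transmit timestamp, in Unix seconds referenced to UTC.
    utc_now = datetime.fromtimestamp(response.tx_time, tz=timezone.utc)
    print(utc_now.isoformat())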

UTC is also the time standard used in aviation,[8] e.g. for flight plans and air traffic control. In this context it is frequently referred to as Zulu time, as described below. Weather forecasts and maps all use UTC to avoid confusion about time zones and daylight saving time. The International Space Station also uses UTC as a time standard.

Amateur radio operators often schedule their radio contacts in UTC, because transmissions on some frequencies can be picked up in many time zones.[9]

Mechanism

UTC divides time into days, hours, minutes, and seconds. Days are conventionally identified using the Gregorian calendar, but Julian day numbers can also be used. Each day contains 24 hours and each hour contains 60 minutes. The number of seconds in a minute is usually 60, but with an occasional leap second, it may be 61 or 59 instead.[10] Thus, in the UTC time scale, the second and all smaller time units (millisecond, microsecond, etc.) are of constant duration, but the minute and all larger time units (hour, day, week, etc.) are of variable duration. Decisions to introduce a leap second are announced at least six months in advance in "Bulletin C" produced by the International Earth Rotation and Reference Systems Service.[11][12] The leap seconds cannot be predicted far in advance due to the unpredictable rate of the rotation of Earth.[13]

Nearly all UTC days contain exactly 86,400 SI seconds with exactly 60 seconds in each minute. UTC is within about one second of mean solar time (such as UT1) at 0° longitude[14] (at the IERS Reference Meridian). The mean solar day is slightly longer than 86,400 SI seconds, so occasionally the last minute of a UTC day is adjusted to have 61 seconds. The extra second is called a leap second. It accounts for the accumulated extra length (about 2 milliseconds per day) of all the mean solar days since the previous leap second. The last minute of a UTC day is permitted to contain 59 seconds to cover the remote possibility of the Earth rotating faster, but that has not yet been necessary. The irregular day lengths mean that fractional Julian days do not work properly with UTC.

Since 1972, UTC may be calculated by subtracting the accumulated leap seconds from International Atomic Time (TAI), which is a coordinate time scale tracking notional proper time on the rotating surface of the Earth (the geoid). In order to maintain a close approximation to UT1, UTC occasionally has discontinuities where it changes from one linear function of TAI to another. These discontinuities take the form of leap seconds implemented by a UTC day of irregular length. Discontinuities in UTC have so far occurred only at the end of June or December, although there is provision for them at the end of March and September as a second preference.[15][16] The International Earth Rotation and Reference Systems Service (IERS) tracks and publishes the difference between UTC and Universal Time, DUT1 = UT1 − UTC, and introduces discontinuities into UTC to keep DUT1 in the interval (−0.9 s, +0.9 s).
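
Since 1972, this relationship is a simple step function: UTC lags TAI by a whole number of seconds that grows by one at each leap second. The sketch below illustrates the lookup with a deliberately abridged excerpt of the leap-second table (only the initial 1972 offset and the two most recent entries are shown; the authoritative table is published by the IERS). It ignores the ambiguity during the leap second itself, which Python's datetime cannot represent, and tags TAI instants as UTC-aware datetimes purely for arithmetic convenience.

    from datetime import datetime, timedelta, timezone

    # Abridged excerpt of the leap-second table: (UTC date from which the offset
    # applies, TAI - UTC in whole seconds). Illustrative only; the authoritative
    # table is maintained by the IERS.
    TAI_MINUS_UTC = [
        (datetime(1972, 1, 1, tzinfo=timezone.utc), 10),
        (datetime(2015, 7, 1, tzinfo=timezone.utc), 36),
        (datetime(2017, 1, 1, tzinfo=timezone.utc), 37),  # most recent leap second
    ]

    def utc_from_tai(tai: datetime) -> datetime:
        """Convert a TAI instant (since 1972) to UTC by subtracting the offset."""
        offset = TAI_MINUS_UTC[0][1]
        for start, seconds in TAI_MINUS_UTC:
            # The new offset takes effect at `start` UTC, i.e. `start + seconds` in TAI.
            if tai >= start + timedelta(seconds=seconds):
                offset = seconds
        return tai - timedelta(seconds=offset)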

As with TAI, UTC is only known with the highest precision in retrospect. Users who require an approximation in real time must obtain it from a time laboratory, which disseminates an approximation using techniques such as GPS or radio time signals. Such approximations are designated UTC(k), where k is an abbreviation for the time laboratory.[17] The time of events may be provisionally recorded against one of these approximations; later corrections may be applied using the International Bureau of Weights and Measures (BIPM) monthly publication of tables of differences between canonical TAI/UTC and TAI(k)/UTC(k) as estimated in real-time by participating laboratories.[18] (See the article on International Atomic Time for details.)

Because of time dilation, a standard clock not on the geoid, or in rapid motion, will not maintain synchrony with UTC. Therefore, telemetry from clocks with a known relation to the geoid is used to provide UTC when required, for example at the locations of spacecraft.

It is impossible to compute the exact time interval elapsed between two UTC timestamps without consulting a table showing how many leap seconds occurred during that interval. By extension, it is not possible to compute the precise duration of a time interval that ends in the future and may encompass an unknown number of leap seconds (for example, the number of TAI seconds between "now" and 2099-12-31 23:59:59). Therefore, many scientific applications that require precise measurement of long (multi-year) intervals use TAI instead. TAI is also commonly used by systems that cannot handle leap seconds. GPS time always remains exactly 19 seconds behind TAI (neither system is affected by the leap seconds introduced in UTC).
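
Continuing the sketch above, and reusing its abridged table, the true elapsed duration between two post-1972 UTC timestamps is the naive difference plus however many leap seconds fell inside the interval:

    def leap_seconds_between(utc_start: datetime, utc_end: datetime) -> int:
        """Count leap seconds from the (abridged) table that fall inside the interval."""
        # Each entry after the first marks the instant just after a leap second.
        return sum(1 for start, _ in TAI_MINUS_UTC[1:] if utc_start < start <= utc_end)

    def elapsed_si_seconds(utc_start: datetime, utc_end: datetime) -> float:
        """Elapsed SI seconds between two UTC instants, leap seconds included."""
        return (utc_end - utc_start).total_seconds() + leap_seconds_between(utc_start, utc_end)

With the full IERS table, the same two functions are exact for any interval since 1972; for an interval ending in the future the leap-second count is unknown, which is precisely why TAI is preferred for such measurements.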

Time zones

Time zones are usually defined as differing from UTC by an integer number of hours,[19] although the laws of each jurisdiction would have to be consulted if sub-second accuracy were required. Several jurisdictions have established time zones that differ by an odd number of half-hours or quarter-hours from UT1 or UTC.

Current civil time in a particular time zone can be determined by adding or subtracting the number of hours and minutes specified by the UTC offset, which ranges from UTC−12:00 in the west to UTC+14:00 in the east (see List of UTC offsets).
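
Converting UTC to a zone's civil time is plain offset arithmetic, as the following standard-library Python sketch shows using the two extreme offsets mentioned above (the example instant is arbitrary):

    from datetime import datetime, timedelta, timezone

    utc_instant = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)  # arbitrary example

    westernmost = timezone(timedelta(hours=-12))   # UTC-12:00
    easternmost = timezone(timedelta(hours=14))    # UTC+14:00

    print(utc_instant.astimezone(westernmost))  # 2024-06-01 00:00:00-12:00
    print(utc_instant.astimezone(easternmost))  # 2024-06-02 02:00:00+14:00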

The time zone using UTC is sometimes denoted UTC+00:00 or by the letter Z—a reference to the equivalent nautical time zone (GMT), which has been denoted by a Z since about 1950. Time zones were identified by successive letters of the alphabet and the Greenwich time zone was marked by a Z as it was the point of origin. The letter also refers to the "zone description" of zero hours, which has been used since 1920 (see time zone history). Since the NATO phonetic alphabet word for Z is "Zulu", UTC is sometimes known as "Zulu time". This is especially true in aviation, where "Zulu" is the universal standard.[20] This ensures that all pilots, regardless of location, are using the same 24-hour clock, thus avoiding confusion when flying between time zones.[21] See the list of military time zones for letters used in addition to Z in qualifying time zones other than Greenwich.

On electronic devices which only allow the time zone to be configured using maps or city names, UTC can be selected indirectly by selecting cities such as Accra in Ghana or Reykjavík in Iceland as they are always on UTC and do not currently use daylight saving time (which Greenwich and London do, and so could be a source of error).[22]

Daylight saving time

UTC does not change with a change of seasons, but local time or civil time may change if a time zone jurisdiction observes daylight saving time (summer time). For example, local time on the east coast of the United States is five hours behind UTC during winter,[23] but four hours behind while daylight saving is observed there.[24]

History

In 1928, the term Universal Time (UT) was introduced by the International Astronomical Union to refer to GMT, with the day starting at midnight.[25] Until the 1950s, broadcast time signals were based on UT, and hence on the rotation of the Earth.

In 1955, the caesium atomic clock was invented. This provided a form of timekeeping that was both more stable and more convenient than astronomical observations. In 1956, the U.S. National Bureau of Standards and U.S. Naval Observatory started to develop atomic frequency time scales; by 1959, these time scales were used in generating the WWV time signals, named for the shortwave radio station that broadcasts them. In 1960, the U.S. Naval Observatory, the Royal Greenwich Observatory, and the UK National Physical Laboratory coordinated their radio broadcasts so that time steps and frequency changes were coordinated, and the resulting time scale was informally referred to as "Coordinated Universal Time".[26][27]

In a controversial decision, the frequency of the signals was initially set to match the rate of UT, but then kept at the same frequency by the use of atomic clocks and deliberately allowed to drift away from UT. When the divergence grew significantly, the signal was phase shifted (stepped) by 20 ms to bring it back into agreement with UT. Twenty-nine such steps were used before 1960.[28]

In 1958, data was published linking the frequency for the caesium transition, newly established, with the ephemeris second. The ephemeris second is a unit in the system of time that, when used as the independent variable in the laws of motion that govern the movement of the planets and moons in the solar system, enables the laws of motion to accurately predict the observed positions of solar system bodies. Within the limits of observable accuracy, ephemeris seconds are of constant length, as are atomic seconds. This publication allowed a value to be chosen for the length of the atomic second that would accord with the celestial laws of motion.[29]

The coordination of time and frequency transmissions around the world began on 1 January 1960. UTC was first officially adopted in 1963 as CCIR Recommendation 374, Standard-Frequency and Time-Signal Emissions, and "UTC" became the official abbreviation of Coordinated Universal Time in 1967.[2]

In 1961, the Bureau International de l'Heure began coordinating the UTC process internationally (but the name Coordinated Universal Time was not formally adopted by the International Astronomical Union until 1967).[30][31] From then on, there were time steps every few months, and frequency changes at the end of each year. The jumps increased in size to 0.1 seconds. This UTC was intended to permit a very close approximation to UT2.[26]

In 1967, the SI second was redefined in terms of the frequency supplied by a caesium atomic clock. The length of second so defined was practically equal to the second of ephemeris time.[32] This was the frequency that had been provisionally used in TAI since 1958. It was soon decided that having two types of second with different lengths, namely the UTC second and the SI second used in TAI, was a bad idea. It was thought better for time signals to maintain a consistent frequency, and that this frequency should match the SI second. Thus it would be necessary to rely on time steps alone to maintain the approximation of UT. This was tried experimentally in a service known as "Stepped Atomic Time" (SAT), which ticked at the same rate as TAI and used jumps of 0.2 seconds to stay synchronised with UT2.[33]

There was also dissatisfaction with the frequent jumps in UTC (and SAT). In 1968, Louis Essen, the inventor of the caesium atomic clock, and G. M. R. Winkler both independently proposed that steps should be of 1 second only, to simplify future adjustments.[34] This system was eventually approved as leap seconds in a new UTC in 1970 and implemented in 1972, along with the idea of maintaining the UTC second equal to the TAI second. The resulting CCIR Recommendation 460 "stated that (a) carrier frequencies and time intervals should be maintained constant and should correspond to the definition of the SI second; (b) step adjustments, when necessary, should be exactly 1 s to maintain approximate agreement with Universal Time (UT); and (c) standard signals should contain information on the difference between UTC and UT."[35]

As an intermediate step at the end of 1971, there was a final irregular jump of exactly 0.107758 TAI seconds, making the total of all the small time steps and frequency shifts in UTC or TAI during 1958–1971 exactly ten seconds, so that 1 January 1972 00:00:00 UTC was 1 January 1972 00:00:10 TAI exactly,[36] and a whole number of seconds thereafter. At the same time, the tick rate of UTC was changed to exactly match TAI. UTC also started to track UT1 rather than UT2. Some time signals started to broadcast the DUT1 correction (UT1 − UTC) for applications requiring a closer approximation of UT1 than UTC now provided.[37][38]

The current version of UTC is defined by International Telecommunication Union Recommendation (ITU-R TF.460-6), Standard-frequency and time-signal emissions,[39] and is based on International Atomic Time (TAI) with leap seconds added at irregular intervals to compensate for the accumulated difference between TAI and time measured by Earth's rotation.[40] Leap seconds are inserted as necessary to keep UTC within 0.9 seconds of the UT1 variant of universal time.[41] See the "Current number of leap seconds" section for the number of leap seconds inserted to date.

Current number of leap seconds

The first leap second occurred on 30 June 1972. Since then, leap seconds have occurred on average about once every 19 months, always on 30 June or 31 December. As of July 2022, there have been 27 leap seconds in total, all positive, putting UTC 37 seconds behind TAI.[42]
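
The 37-second figure is simply the 10-second offset fixed at the start of 1972 (described above) plus one second for each of the 27 leap seconds inserted since:

    \mathrm{TAI} - \mathrm{UTC} = 10\,\mathrm{s} + 27 \times 1\,\mathrm{s} = 37\,\mathrm{s}.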

A study published in March 2024 in Nature concluded that accelerated melting of ice in Greenland and Antarctica due to climate change has decreased Earth's rotational velocity, delaying the need for an unprecedented negative leap second that could cause problems for computer networks relying on UTC.[43]

Rationale

Graph showing the difference DUT1 between UT1 and UTC (in seconds). Vertical segments correspond to leap seconds.

Earth's rotational speed is very slowly decreasing because of tidal deceleration; this increases the length of the mean solar day. The length of the SI second was calibrated on the basis of the second of ephemeris time[29][32] and can now be seen to have a relationship with the mean solar day observed between 1750 and 1892, analysed by Simon Newcomb. As a result, the SI second is close to 1/86400 of a mean solar day in the mid‑19th century.[44] In earlier centuries, the mean solar day was shorter than 86,400 SI seconds, and in more recent centuries it is longer than 86,400 seconds. Near the end of the 20th century, the length of the mean solar day (also known simply as "length of day" or "LOD") was approximately 86,400.0013 s.[45] For this reason, UT is now "slower" than TAI by the difference (or "excess" LOD) of 1.3 ms/day.

The excess of the LOD over the nominal 86,400 s accumulates over time, causing the UTC day, initially synchronised with the mean sun, to become desynchronised and run ahead of it. Near the end of the 20th century, with the LOD at 1.3 ms above the nominal value, UTC ran faster than UT by 1.3 ms per day, getting a second ahead roughly every 800 days. Thus, leap seconds were inserted at approximately this interval, retarding UTC to keep it synchronised in the long term.[46] The actual rotational period varies due to unpredictable factors such as tectonic motion and has to be observed, rather than computed.
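
The roughly 800-day spacing follows directly from the late-20th-century figure quoted above: a daily surplus of 1.3 ms needs about that long to accumulate to a whole second,

    \frac{1\,\mathrm{s}}{1.3\,\mathrm{ms/day}} \approx 770\ \mathrm{days} \approx 800\ \mathrm{days}.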

Just as adding a leap day every four years does not mean the year is getting longer by one day every four years, the insertion of a leap second every 800 days does not indicate that the mean solar day is getting longer by a second every 800 days. It will take about 50,000 years for a mean solar day to lengthen by one second (at a rate of 2 ms per century). This rate fluctuates within the range of 1.7–2.3 ms/cy. While the rate due to tidal friction alone is about 2.3 ms/cy, the uplift of Canada and Scandinavia by several metres since the last ice age has temporarily reduced this to 1.7 ms/cy over the last 2,700 years.[47] The correct reason for leap seconds, then, is not the current difference between actual and nominal LOD, but rather the accumulation of this difference over a period of time: Near the end of the 20th century, this difference was about 1/800 of a second per day; therefore, after about 800 days, it accumulated to 1 second (and a leap second was then added).

In the graph of DUT1 above, the excess of LOD above the nominal 86,400 s corresponds to the downward slope of the graph between vertical segments. (The slope became shallower in the 1980s, 2000s and late 2010s to 2020s because of slight accelerations of Earth's rotation temporarily shortening the day.) Vertical position on the graph corresponds to the accumulation of this difference over time, and the vertical segments correspond to leap seconds introduced to match this accumulated difference. Leap seconds are timed to keep DUT1 within the vertical range depicted by the adjacent graph. The frequency of leap seconds therefore corresponds to the slope of the diagonal graph segments, and thus to the excess LOD. Time periods when the slope reverses direction (slopes upwards, not the vertical segments) are times when the excess LOD is negative, that is, when the LOD is below 86,400 s.

Future

As the Earth's rotation continues to slow, positive leap seconds will be required more frequently. The long-term rate of change of LOD is approximately +1.7 ms per century. At the end of the 21st century, LOD will be roughly 86,400.004 s, requiring leap seconds every 250 days. Over several centuries, the frequency of leap seconds will become problematic.[48] However, a change in the trend of the UT1 − UTC values was seen beginning around June 2019: instead of continuing to slow (with leap seconds keeping the difference between UT1 and UTC below 0.9 seconds), the Earth's rotation sped up, causing this difference to increase. If the trend continues, a negative leap second, which has never been used, may be required, though probably not before 2025.[49][50]
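
The 250-day estimate follows from the same accumulation argument, using the projected end-of-century excess of about 4 ms per day:

    \frac{1\,\mathrm{s}}{(86\,400.004 - 86\,400)\,\mathrm{s/day}} = \frac{1\,\mathrm{s}}{4\,\mathrm{ms/day}} = 250\ \mathrm{days}.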

Some time in the 22nd century, two leap seconds will be required every year. The current practice of only allowing leap seconds in June and December will be insufficient to maintain a difference of less than 1 second, and it might be decided to introduce leap seconds in March and September. In the 25th century, four leap seconds are projected to be required every year, so the current quarterly options would be insufficient.

In April 2001, Rob Seaman of the National Optical Astronomy Observatory proposed that leap seconds be allowed to be added monthly rather than twice yearly.[51]

In 2022, the General Conference on Weights and Measures adopted a resolution to redefine UTC and abolish leap seconds by 2035, while keeping the civil second constant and equal to the SI second, so that sundials would slowly drift further and further out of sync with civil time. The resolution does not break the connection between UTC and UT1, but it increases the maximum allowable difference; the details of what the maximum difference will be, and how corrections will be implemented, are left for future discussion.[3] The change will cause the Sun's apparent motion to shift relative to civil time, with the difference increasing quadratically with time (i.e., proportional to elapsed centuries squared), analogous to the shift of the seasons relative to the calendar that results from the calendar year not precisely matching the tropical year. The effect on civil timekeeping would be slow at first but would become drastic over several centuries: UTC (and TAI) would run further and further ahead of UT, coinciding with local mean time along a meridian drifting eastward faster and faster.[52] The time system would thus lose its fixed connection to the geographic coordinates based on the IERS meridian. The difference between UTC and UT would reach 0.5 hours after the year 2600 and 6.5 hours around 4600.[53]
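
The quadratic growth can be illustrated with a rough back-of-envelope sketch (an estimate only, using the long-term tidal rate of about 1.7 ms per day per century quoted earlier and measuring time t in centuries from the mid-19th century, when the mean solar day last equalled 86,400 SI seconds): the daily excess grows linearly with t, so its running total grows as t squared,

    \Delta(t) \approx \tfrac{1}{2} \times 1.7\,\mathrm{ms\,day^{-1}\,century^{-1}} \times 36\,525\,\mathrm{days\,century^{-1}} \times t^{2} \approx 31\,t^{2}\ \mathrm{s}.

Evaluated over the roughly seven and a half centuries to the year 2600, and ignoring the comparatively small part already absorbed by leap seconds, this is on the order of half an hour, consistent with the figure cited above.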

ITU-R Study Group 7 and Working Party 7A were unable to reach consensus on whether to advance the proposal to the 2012 Radiocommunications Assembly; the chairman of Study Group 7 elected to advance the question to the 2012 Radiocommunications Assembly (20 January 2012),[54] but consideration of the proposal was postponed by the ITU until the World Radio Conference in 2015.[55] This conference, in turn, considered the question,[56] but no permanent decision was reached; it only chose to engage in further study with the goal of reconsideration in 2023.[57][needs update]

A proposed alternative to the leap second is the leap hour or leap minute, which requires changes only once every few centuries.[58]

The ITU World Radiocommunication Conference 2023 (WRC-23), held in Dubai, United Arab Emirates, from 20 November to 15 December 2023, formally recognised Resolution 4 of the 27th CGPM (2022), which decides that the maximum value for the difference (UT1 − UTC) will be increased in, or before, 2035.[59]


References

Notes

  1. ^ The pips are no longer broadcast from Greenwich, but from the National Physical Laboratory in Teddington, Surrey, which uses Coordinated Universal Time (UTC) – the successor of GMT – for its reading.[1]

Citations

  1. ^ Evers 2013, p. 74.
  2. ^ a b c McCarthy 2009, p. 4.
  3. ^ a b "Resolutions of the General Conference on Weights and Measures (27th Meeting)". Bureau International des Poids et Mesures. 19 November 2022. Archived from the original on 19 November 2022. Retrieved 19 August 2022.
  4. ^ SI Brochure (9th ed.). BIPM. 2019. French version. Retrieved 9 September 2023.
  5. ^ "Why is UTC used as the acronym for Coordinated Universal Time instead of CUT?". NIST Time Frequently Asked Questions (FAQ). National Institute of Standards and Technology, Time and Frequency Division. 3 February 2010. Archived from the original on 6 July 2011. Retrieved 17 July 2011.
  6. ^ IAU resolutions 1976.
  7. ^ How NTP Works 2011.
  8. ^ Aviation Time 2006.
  9. ^ Horzepa 2010.
  10. ^ ITU Radiocommunication Assembly 2002, p. 3.
  11. ^ International Earth Rotation and Reference Systems Service 2011.
  12. ^ McCarthy & Seidelmann 2009, p. 229.
  13. ^ McCarthy & Seidelmann 2009, chapter 4.
  14. ^ Guinot 2011, p. S181.
  15. ^ History of TAI-UTC c. 2009.
  16. ^ McCarthy & Seidelmann 2009, pp. 217, 227–231.
  17. ^ McCarthy & Seidelmann 2009, p. 209.
  18. ^ "Circular T". International Bureau of Weights and Measures. Archived from the original on 30 June 2022. Retrieved 17 June 2022.
  19. ^ Seidelmann 1992, p. 7.
  20. ^ Military & Civilian Time Designations n.d.
  21. ^ Williams 2005.
  22. ^ Iceland 2011.
  23. ^ 15 U.S. Code § 261 2007.
  24. ^ 15 U.S. Code § 260a 2005.
  25. ^ McCarthy & Seidelmann 2009, pp. 10–11.
  26. ^ a b McCarthy & Seidelmann 2009, pp. 226–227.
  27. ^ McCarthy 2009, p. 3.
  28. ^ Arias, Guinot & Quinn 2003.
  29. ^ a b Markowitz et al. 1958.
  30. ^ Nelson & McCarthy 2005, p. 15.
  31. ^ Nelson et al. 2001, p. 515.
  32. ^ a b Markowitz 1988.
  33. ^ McCarthy & Seidelmann 2009, p. 227.
  34. ^ Essen 1968, pp. 161–165.
  35. ^ McCarthy 2009, p. 5.
  36. ^ Blair 1974, p. 32.
  37. ^ Seidelmann 1992, pp. 85–87.
  38. ^ Nelson, Lombardi & Okayama 2005, p. 46.
  39. ^ ITU Radiocommunication Assembly 2002.
  40. ^ Chester 2015.
  41. ^ "How often do we have leap seconds?". NIST Time Frequently Asked Questions (FAQ). National Institute of Standards and Technology, Time and Frequency Division. 4 February 2010. Archived from the original on 12 August 2016. Retrieved 13 July 2017.
  42. ^ Bulletin C 2022.
  43. ^ Agnew, Duncan Carr (27 March 2024). "A global timekeeping problem postponed by global warming". Nature. 628 (8007): 333–336. Bibcode:2024Natur.628..333A. doi:10.1038/s41586-024-07170-0. PMID 38538793.
  44. ^ McCarthy & Seidelmann 2009, p. 87.
  45. ^ McCarthy & Seidelmann 2009, p. 54.
  46. ^ McCarthy & Seidelmann 2009, p. 230. (Average for period from 1 January 1991 through 1 January 2009. Average varies considerably depending on what period is chosen.)
  47. ^ Stephenson & Morrison 1995.
  48. ^ McCarthy & Seidelmann 2009, p. 232.
  49. ^ "Are Negative Leap Seconds in Our Future?" (PDF) (Press release). US Naval Observatory. 10 February 2021. Retrieved 18 June 2022.
  50. ^ "Plots for UT1-UTC – Bulletin A All". International Earth Rotation and Reference Systems Service. 16 September 2021. Archived from the original on 23 October 2021. Retrieved 16 September 2021.
  51. ^ Seaman, Rob (9 April 2001). "Upgrade, don't degrade". Archived from the original on 2 June 2013. Retrieved 10 September 2015.
  52. ^ Irvine 2008.
  53. ^ Allen 2011a.
  54. ^ Seidelmann & Seago 2011, p. S190.
  55. ^ Leap decision postponed 2012.
  56. ^ "ITU World Radiocommunication Conference set for Geneva, 2–27 November 2015" (Press release). International Telecommunication Union. 2015. Retrieved 3 November 2015.
  57. ^ "Coordinated Universal Time (UTC) to retain "leap second"". itu.int (Press release). Retrieved 12 July 2017.
  58. ^ "Scientists propose 'leap hour' to fix time system". The New Indian Express. 14 May 2012. Archived from the original on 3 September 2022. Retrieved 3 September 2022.
  59. ^ BIPM

General and cited sources
