Correcting Atomic Time, from 1 second every 3 billion years to 1 in every 80 billion!


  1. #1
    Member JohnnyBlazE's Avatar
    Join Date
    Jul 2008
    London, UK

    Correcting Atomic Time, from 1 second every 3 billion years to 1 in every 80 billion!

    Yup, our atomic timekeeping right now just ISN'T GOOD ENOUGH!

    Here's how we can make it better...!

    (That) frequency is used to mark out time to an accuracy of better than 1 part in 10^17, or 1 second in 3 billion years.

    That's pretty good, but it could be better. Infrared photons emanating from the background cause the two energy levels to shift by slightly different amounts, says Marianna Safronova at the University of Delaware. That affects the frequency of the emitted radiation to an unknown extent, adding a small uncertainty to the clock's tick.

    Safronova reported this month at a conference in Baltimore, Maryland, that by combining two different mathematical approaches, she and her colleagues have now managed to calculate how much the energy gap between the two levels changes.

    Using this information to correct an atomic clock could in principle increase its precision to around 4 parts in 10^19, or about 1 second per 80 billion years. Such a clock could test whether the fundamental constants of nature are changing, Safronova suggests.

    Super-accurate atomic clocks emerge from 'heat haze' - 5/16/2011 - Electronics Weekly
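    Just for fun, a quick back-of-the-envelope check of those two figures. A clock with fractional frequency uncertainty f drifts by roughly f seconds per elapsed second, so it takes 1 / (f × seconds-per-year) years to accumulate one full second of error:

    ```python
    # Sanity-check the drift figures quoted in the article.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7 s

    def years_per_second_of_drift(fractional_uncertainty):
        """Years needed for the clock to accumulate 1 s of error."""
        return 1.0 / (fractional_uncertainty * SECONDS_PER_YEAR)

    # Current clock: 1 part in 10^17 -> ~3.2 billion years per second of error
    print(f"{years_per_second_of_drift(1e-17):.2e}")  # ~3.17e9

    # Corrected clock: 4 parts in 10^19 -> ~79 billion years
    print(f"{years_per_second_of_drift(4e-19):.2e}")  # ~7.92e10
    ```

    Both come out consistent with the "3 billion" and "80 billion" numbers in the article.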

  2. #2
    Member xevious's Avatar
    Join Date
    Feb 2008
    near Manhattan... Status: Changing course, hard about

    Re: Correcting Atomic Time, from 1 second every 3 billion years to 1 in every 80 billion!

    This makes sense for scientific applications, where extreme precision is necessary, though it would be interesting to hear what kinds of applications it would actually make a difference in. For the consumer, it's meaningless.

    It does make me wonder, though... how it was done before the advent of atomic clocks (the first examples were made in the 1950s). What was the base reference used to calibrate the first clock? At what point could they say "NOW" was the start of 00:00:00 GMT on a given day? I imagine there must be a solar component to this, a means of determining the absolute position of the Earth in its orbit around the sun and its rotation about its axis.

    Quote Originally Posted by Wikipedia
    In August 2004, NIST scientists demonstrated a chip-scale atomic clock. According to the researchers, the clock was believed to be one-hundredth the size of any other. It was also claimed to require just 75 mW, making it suitable for battery-driven applications. This device could conceivably become a consumer product.
    In rotation: Citizen Attesa ATV53-2834, Eco Drives | Omega Seamaster | CASIO: TW-7000, MRG-220, RevMan, G-2000D, DW-5700ML, GW-9100 | Seiko SKA-413, SBPG001

  3. #3
    Member tribe125's Avatar
    Join Date
    Mar 2006

    Re: Correcting Atomic Time, from 1 second every 3 billion years to 1 in every 80 billion!

    Interesting question, xevious. Can't answer it, of course...
    I used to list my watches here until I realised it ruined people's Google searches...


  5. #4
    Member ronbo's Avatar
    Join Date
    Jun 2006

    Re: Correcting Atomic Time, from 1 second every 3 billion years to 1 in every 80 billion!

    Radio clocks kept timekeeping in sync prior to atomic clocks, I believe. Article here

  6. #5
    Member GatorJ's Avatar
    Join Date
    Feb 2006

    Re: Correcting Atomic Time, from 1 second every 3 billion years to 1 in every 80 billion!

    Before standardized time zones, local time was kept by communities primarily based on "high" noon, when the sun was highest in the sky. Standardized time zones were created in large part because of the railroads.

    I found this history of the various clocks used as the official time standard on the NIST website:

    "The very earliest NIST work in the area of time and frequency took place within the Weights and Measures Section in Washington. One of the first objectives was the testing of watches and other timekeeping apparatus. In 1904 NIST purchased a very stable pendulum clock, the Riefler Clock, from Clemens Riefler in Germany. This clock served as a time interval standard until 1929 when it was replaced by the Shortt Clock, a double pendulum clock developed at Edinburgh Observatory and fabricated in London. This mechanical standard was replaced only a few years later by standards based on electronic methods. NIST's initial involvement in electronic time-and-frequency methods was spurred by problems encountered in the early days of radio broadcasting in the United States. Commercial radio broadcasters were having difficulty keeping their broadcasts "on frequency" because they lacked adequate frequency standards. This requirement launched NBS into the development of inductance-capacitance wavemeters and then Quartz-Crystal Frequency Standards.
    In 1923, in order to meet the growing needs of the broadcast industry, NIST initiated radio broadcasts of frequency signals that continue to this day, although these now include time information as well. NIST Radio Broadcasts originated from a series of locations initially on the East Coast and later in Colorado. In the process of restating and expanding the mission of NIST in 1950, the Congress recognized the importance of this activity by including the function of "broadcasting of radio signals of standard frequency." Current NIST radio signals now emanate from broadcast stations WWV and WWVB located just north of Fort Collins, Colorado and from WWVH located on the island of Kauai, Hawaii. NIST also offers two services designed to synchronize computer clocks and other automated equipment at modest accuracy levels: the Automated Computer Time Service (ACTS), and the rapidly growing Internet Time Service.
    A major event in the NIST history was the development of the first Atomic Clock in 1949. This atomic clock was based on an absorption line in the ammonia molecule. An atomic clock is really just a frequency standard in which a running count of oscillations is recorded. This distinction between atomic clock and atomic frequency standard is analogous to the pendulum clock, where the pendulum frequency standard is used to drive an escapement mechanism that keeps track of the ticks, thus producing a clock. From the very beginning of this program, it was believed that atomic beam methods offered the best approach to an atomic frequency standard, and indeed the next seven Atomic Frequency Standards produced by NIST were based on beams of cesium atoms. This and similar work elsewhere in the world eventually led in 1967 to an international redefinition of the second as "the duration of 9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom." Improvements in NIST Frequency Standards over the next five decades proceeded at a rapid rate (accuracy improvement of better than an order of magnitude every ten years), culminating in today's standard, NIST-F1, with an accuracy of less than one second in 30 million years (less than 1 part in 10^15)."

    History of the NIST Time and Frequency Division
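    You can check that last NIST-F1 figure the same way as the numbers in the first post. An accuracy of 1 part in 10^15 means the cesium frequency is off by only a few microhertz out of 9 192 631 770 Hz, and the clock takes tens of millions of years to drift one second:

    ```python
    # Sanity-check the NIST-F1 figure quoted in the NIST history above.
    CESIUM_HZ = 9_192_631_770          # SI definition: periods per second
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    frac = 1e-15                       # NIST-F1 fractional accuracy
    freq_error_hz = CESIUM_HZ * frac   # absolute frequency error
    years_per_second = 1.0 / (frac * SECONDS_PER_YEAR)

    print(f"{freq_error_hz:.1e} Hz")   # ~9.2e-06 Hz, a few microhertz
    print(f"{years_per_second / 1e6:.1f} million years")  # ~31.7 million
    ```

    That ~31.7 million years is consistent with the "less than one second in 30 million years" claim in the quote.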
