Device That Revolutionized Timekeeping Receives an IEEE Milestone

The atomic clock, invented in 1948, paved the way for GPS

11 August 2017

The invention of the atomic clock fundamentally altered the way that time is measured and kept. The clock helped redefine the duration of a single second, and its groundbreaking accuracy contributed to technologies we rely on today, including cellphones and GPS receivers.

Building on the accomplishments of previous researchers, Harold Lyons and his colleagues at the U.S. National Bureau of Standards (now the National Institute of Standards and Technology), in Washington, D.C., began working in 1947 on developing an atomic clock and demonstrated it to the public two years later. Rooted in atomic physics, the clock kept time by tracking the microwave signals that electrons in atoms emit when they change energy levels.

This month the atomic clock received an IEEE Milestone. Administered by the IEEE History Center and supported by donors, the milestone program recognizes outstanding technical developments around the world.


For thousands of years the reference for timekeeping was the Earth's rotation rate, which offered only limited accuracy. In the 1920s the quartz crystal oscillator circuit was invented. It kept time using the mechanical resonance of vibrating crystals of piezoelectric material, which produced electrical signals at a precise frequency. The circuits were accurate enough to measure and record variations in the Earth's rotation, but their performance was still limited, and they were sensitive to environmental changes.

Physicist James Clerk Maxwell was perhaps the first to recognize that atoms could be used to keep time. In 1879 he wrote to electricity pioneer William Thomson, suggesting that the “period of vibration of a piece of quartz crystal” would be a better absolute standard of time than the mean solar second (based on the Earth’s rotation) but would still depend “essentially on one particular piece of matter” and therefore would be “liable to accidents.” Maxwell theorized that atoms would work even better as a natural standard of time. Thomson wrote in the second edition of the Elements of Natural Philosophy, published in 1879, that hydrogen atoms, sodium atoms, and others were “absolutely alike in every physical property” and “probably remain the same so long as the particle itself exists.”

Atomic clock experiments didn’t begin until nearly 60 years after the correspondence between Maxwell and Thomson. Early experiments in the 1930s and 1940s were made possible by rapid advances in quantum mechanics and microwave electronics.

Most of the concepts that led to atomic clocks were developed by physicist Isidor Isaac Rabi and his colleagues at Columbia University in the 1930s. In 1939 Rabi informally discussed with scientists at the National Bureau of Standards his idea of using his team's molecular-beam magnetic resonance technique as a time standard. He first measured a cesium atom's resonance frequency in 1940, estimating it at 9191.4 megacycles, close to the number that later would define the second. For his work, Rabi received the 1944 Nobel Prize in physics. But atomic clock research at Columbia was halted during World War II.


Lyons and his NBS colleagues began designing their atomic clock in 1947 and had a working prototype by 1948. Based on the frequency of the microwaves emitted by the ammonia molecule, the clock was not accurate enough to be used as a time standard, but it did prove the concept. In 1955 Louis Essen, a physicist at the U.K. National Physical Laboratory, outside London, built the first atomic clock accurate enough to be a time standard.

Before atomic clocks, the second was defined by dividing astronomical events, such as the solar day or the tropical year, into smaller parts. That changed in 1967, when the second was redefined as the duration of 9,192,631,770 periods of the microwave radiation emitted by the cesium atom's hyperfine transition. The new definition meant that seconds were measured by counting atomic oscillations, and minutes and hours became multiples of the second rather than divisions of the day.
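The arithmetic behind the redefinition can be sketched in a few lines of Python. This is an illustrative snippet, not part of the original article; the function name is hypothetical, but the constant is the actual value fixed by the 1967 definition.

```python
# The SI second (since 1967) is defined as 9,192,631,770 periods of the
# microwave radiation from the cesium-133 hyperfine transition.
CESIUM_HZ = 9_192_631_770

def elapsed_seconds(cycles: int) -> float:
    """Convert a count of cesium hyperfine oscillations to elapsed seconds."""
    return cycles / CESIUM_HZ

# Counting exactly 9,192,631,770 cycles yields one second.
print(elapsed_seconds(9_192_631_770))   # 1.0
# A minute is 60 seconds' worth of cycles, a multiple of the second,
# rather than a division of the day.
print(elapsed_seconds(60 * CESIUM_HZ))  # 60.0
```

The point of the sketch is the inversion the article describes: time intervals are built up by counting oscillations, instead of being carved out of an astronomical day.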

Modern atomic clocks first cool the atoms to near absolute zero by slowing them with lasers, then toss them upward in an atomic fountain and probe their resonance as they pass through a microwave-filled cavity. An example is the U.S. NIST-F1 atomic clock.


Lyons’ clock was honored on 8 August with ceremonies at the National Institute of Standards and Technology, in Gaithersburg, Md., and at the site of the former NBS building, in Washington, D.C. Speakers at the Gaithersburg ceremony included IEEE Senior Member Kent Rochford, acting NIST director; Senior Member Tony Ivanov, chair of the IEEE Washington (D.C.) Section; Member Steven Jefferts, an NIST physicist; and Fellow Clark Nguyen, president of the IEEE Ultrasonics, Ferroelectrics, and Frequency Control Society.

The two identical plaques read:

The first atomic clock, developed near this site by Harold Lyons at the National Bureau of Standards, revolutionized timekeeping by using transitions of the ammonia molecule as its source of frequency. Far more accurate than previous clocks, atomic clocks quickly replaced the Earth’s rotational rate as the reference for world time. Atomic clock accuracy made possible many new technologies including the Global Positioning System (GPS).

This article was written with assistance from the IEEE History Center, which is partially funded by donations to the IEEE Foundation.
