The "atomic second" turns 50
The "atomic second" marked the beginning of a revolutionary era: it was born as early as 1955, when the first cesium atomic clock was put into operation. In the fall of 1967, it was included in the International System of Units. This was the beginning of a development which will, in all likelihood, come to an end in the fall of 2018, when the 26th General Conference on Weights and Measures (CGPM) is expected to decide that the entire International System of Units (SI) is to be based on invariable properties of nature - on fundamental constants. In this development, the second followed the meter, but in the race for accuracy it plays an outstanding role: no other unit can be realized with such accuracy. Today's cesium atomic clocks - such as the four primary clocks of the Physikalisch-Technische Bundesanstalt (PTB), which are responsible for realizing and disseminating legal time in Germany - provide the time unit with an almost unimaginable accuracy of up to 16 decimal places!
"You are giving us a beautiful topic to meditate about: measuring the trajectory of the stars in the infinite depth of space based on the oscillation of an infinitesimally small atom." With these poetic words, the then French foreign minister, Couve de Murville, described what was about to happen in Paris. In 1967, the scientists and politicians gathered there for the 13th General Conference on Weights and Measures (CGPM) decided to redefine the second. The decision was made on 13 October 1967: "The second is the duration of 9 192 631 770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium 133 atom." The second was thus defined on the basis of an atomic quantum transition. The large number in the definition stands for slightly more than 9 billion oscillations per second - the particular frequency of microwave radiation which triggers exactly this quantum transition in the outermost electron of the electron shell of the cesium atom. This definition is still valid today. And it will not need to be modified, not even for the transition to the "new SI" planned for the fall of 2018. The formulation of the 50-year-old definition was evidently clear-sighted and sustainable.
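The two numbers in this story - the defined frequency of 9 192 631 770 oscillations per second and the "16 decimal places" of accuracy - can be turned into tangible quantities with a little arithmetic. The sketch below (a back-of-envelope illustration, not an official metrological calculation) assumes that "16 decimal places" means a fractional frequency uncertainty of about 1e-16:

```python
# Back-of-envelope numbers behind the SI second.
# The cesium frequency is exact by definition; the 1e-16 fractional
# uncertainty is an assumed reading of "16 decimal places".

CS_FREQUENCY_HZ = 9_192_631_770  # Cs-133 hyperfine transition, periods per second

# Duration of a single cesium oscillation: roughly a tenth of a nanosecond
period_s = 1 / CS_FREQUENCY_HZ

# A fractional accuracy of 1e-16 means the clock gains or loses at most
# one second over 1e16 seconds. Express that span in years:
SECONDS_PER_YEAR = 365.25 * 24 * 3600
years_per_second_error = 1e16 / SECONDS_PER_YEAR

print(f"One cesium oscillation lasts about {period_s:.3e} s")
print(f"1e-16 accuracy means at most one second of error in about "
      f"{years_per_second_error:.2e} years")
```

In other words, a clock at this level of accuracy would drift by no more than one second over roughly 300 million years - longer than the age of the dinosaurs.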
From a historical viewpoint, time - besides length and weight - is certainly one of the most important quantities. Although both philosophers and physicists find it difficult to define exactly what time is, it is part and parcel of everyday life. Time has been measured for several millennia, first via the movement of the Earth relative to the Sun and the fixed stars. This made it possible to determine time everywhere in a simple way. For a long time, this was much more important than being able to measure it precisely. Only with the era of industrialization did this change, leading to an enormous acceleration in the history of time measurement. The "timeline of time measurement" can be summarized as follows: a few millennia of sundials, a few centuries of pendulum clocks, roughly one century of quartz clocks and five decades of atomic clocks - which have become more accurate with every decade.
One of the first suggestions on how to measure physical units in a modern way came from the physicist James Clerk Maxwell: back in 1870, he said that one should not rely on the measures provided by the Earth, such as the length of a day for the second and the Earth's circumference for the meter, but rather on fundamental constants in order to define the physical units. A suitable fundamental constant for the measurement of time was identified in 1940 by the American physicist Isidor Isaac Rabi, namely the frequency of the transition between two selected states of an atom. In his opinion, the hyperfine structure states in the 133Cs atom (a non-radioactive isotope of cesium) were particularly well suited for this purpose. He was awarded the Nobel Prize in 1944, and his prediction that "radio frequencies in hearts of atoms would be used in most accurate of time-pieces" was published in a prominent spot in the 21 January 1945 issue of the New York Times.
Ten years later, in 1955, the first cesium atomic clock was "ticking" at the National Physical Laboratory (NPL) in the UK. At the same time, preparations for what was to become the International System of Units in 1960 were in full swing. However, the scientists involved in the committees of the Metre Convention did not yet fully trust the brave new world of the atomic second. For the time being, time remained the domain of astronomers: in 1960, the so-called "ephemeris second" was defined - which, with hindsight, was not particularly useful. In the fall of 1967, however, the scientists who believed in the future of atomic clocks prevailed. From then on, the "modern" definition of the second applied.
At PTB, the first "home-made" cesium atomic clock started "ticking" in 1969. Since then, three additional clocks have joined it. PTB has become one of the leading "time makers"; its cesium atomic clocks make a large contribution to generating the worldwide reference time. The next generation of clocks - the so-called optical atomic clocks - is also already showing its huge potential at PTB; but that could be the topic of a press release for an anniversary in ten years' time.