[time-nuts] Cesium vs H Maser clocks

Tom Van Baak tvb at LeapSecond.com
Sat Nov 29 06:30:33 UTC 2008


> Isn't the temperature the _only_ thing to correct for?

No, not at all. Read the links that I provided to see that a real
cesium standard is not quite so simple, at least when you get
down to the 1e-13, 1e-14, 1e-15 levels. At those levels there
are all sorts of cool things that push or pull the frequency and
need to be corrected for.

> The definition of the second is "...the duration of 9 192 631 770 
> periods of the radiation corresponding to the transition between the 
> two hyperfine levels of the ground state of the cesium 133 atom." (and 
> affirmed by the CIPM in 1997 that this definition refers to a cesium 
> atom in its ground state at a temperature of 0 K)

Right. But realize that most of the cesium standards that we use
are not running at 0 Kelvin. So there is a correction for that.
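
For the curious, the dominant effect here is the blackbody radiation
shift: thermal radiation from the room-temperature beam tube pulls
the frequency slightly low compared to an atom at 0 K. A rough sketch
in Python, using the commonly quoted coefficient of roughly -1.7e-14
at 300 K (treat the exact value as approximate):

    # Rough sketch of the cesium blackbody radiation shift.
    # The coefficient is approximate; the shift scales as T^4.
    def blackbody_shift(temp_k, coeff_300k=-1.7e-14):
        """Fractional frequency shift relative to a cesium atom at 0 K."""
        return coeff_300k * (temp_k / 300.0) ** 4

    print(blackbody_shift(300.0))   # about -1.7e-14 at room temperature
    print(blackbody_shift(313.0))   # a warmer beam tube shifts a bit more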

In order for cesium beam standards to even work, one must apply
a slight magnetic field, the so-called C-field, which rather strongly
distorts the shape of the resonance peak. The definition assumes
zero magnetic field, so this too must be modeled and corrected for.
That's why, for example, the hp 5062c runs at 9,192,631,774.3133 Hz,
not the textbook 9,192,631,770 Hz. An internal synthesizer takes
care of this correction.
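
To put a number on the size of that correction, here's a quick
back-of-the-envelope sketch (Python) using the frequencies quoted
above:

    # How big is the 5062c C-field offset as a fraction?
    f_si   = 9_192_631_770.0       # Hz, the SI-defined hyperfine frequency
    f_5062 = 9_192_631_774.3133    # Hz, the frequency the 5062c runs at

    offset_hz   = f_5062 - f_si    # about 4.3 Hz
    offset_frac = offset_hz / f_si # about 4.7e-10 fractional

    print(f"C-field offset: {offset_hz:.4f} Hz = {offset_frac:.2e} fractional")
    # The internal synthesizer removes this known offset so the standard's
    # outputs still tick at the SI rate.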

The NIST papers list a dozen or so of these corrections, each of
which is a nice lesson in atomic physics by itself.

Note also that clocks at NIST run about 1.8e-13 fast due to the high
elevation of Boulder, CO (general relativity), which is yet another
factor that has to be corrected for compared to the official sea-level
definition of the second.
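
That 1.8e-13 figure is easy to sanity-check with the weak-field
approximation df/f = g*h/c^2, assuming an elevation of roughly
1650 m for the Boulder labs (my round number, not an official one):

    # Sanity check of the ~1.8e-13 gravitational blueshift at Boulder.
    g = 9.81           # m/s^2, local gravitational acceleration (approx.)
    h = 1650.0         # m, assumed height above the geoid
    c = 299_792_458.0  # m/s, speed of light

    frac_shift = g * h / c**2                      # weak-field approximation
    print(f"fractional shift: {frac_shift:.1e}")   # ~1.8e-13, clock runs fast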

> That other factors can change the relative frequency of different Cs 
> clocks is a problem with the definition, not an indication that any 
> particular one is better than another. If a magnetic field changes the 
> relative frequency, but that isn't reflected in the definition, is it 
> not the definition which is faulty, and not the timepiece? The second 
> is imprecise in this regard. 

The definition is fine -- it applies to the ideal conditions. But if you
decide to build an apparatus to implement the definition, and if for
whatever reason the ideal conditions can't be met in your apparatus,
then it is up to you, the clock builder, to anticipate this and make
corrections for it so that your clock still counts SI seconds at the
output BNC connector.
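
In practice that means characterizing each known shift and removing
their sum, typically in the output synthesizer. A toy sketch (Python)
with placeholder values, just to show the bookkeeping:

    # Toy illustration only: the individual values below are placeholders,
    # not the corrections for any real instrument.
    F_SI = 9_192_631_770.0   # Hz, defined cesium hyperfine frequency

    corrections = {
        "C-field (Zeeman)":         +4.7e-10,
        "blackbody radiation":      -1.7e-14,
        "gravitational (altitude)": +1.8e-13,
    }

    total = sum(corrections.values())
    f_atoms  = F_SI * (1.0 + total)      # what the physics package sees
    f_output = f_atoms / (1.0 + total)   # synthesizer backs the shifts out

    print(f"total known shift: {total:+.3e}")
    print(f"output ticks at:   {f_output:.4f} Hz")   # back to the SI value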

The other thing to note is that most cesium standards come with a
specification, based on design. I don't have the exact numbers but
a 5061A might be accurate out-of-the-box to 1e-11 while a 5071A
might be accurate to 1e-13. This reflects the difference in design,
manufacturing tolerances, and the number of internal frequency
offsets that are controlled or compensated for in hardware or in
firmware. So the definition of the SI second is fine; it's just that
some clocks can get closer to realizing the definition than others.
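
One way to get a feel for those numbers is to convert the fractional
accuracy into a worst-case time error per day (quick Python sketch;
the accuracy figures are the rough ones quoted above):

    SECONDS_PER_DAY = 86_400

    for label, accuracy in [("5061A-class, ~1e-11", 1e-11),
                            ("5071A-class, ~1e-13", 1e-13)]:
        error_ns = accuracy * SECONDS_PER_DAY * 1e9   # worst case, ns/day
        print(f"{label}: up to {error_ns:.0f} ns/day")
    # ~860 ns/day vs ~9 ns/day -- same SI second, very different realizations.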

/tvb




