[time-nuts] Are there limits to the accuracy of clocks?

Bill Hawkins bill at iaxs.net
Thu Mar 30 22:40:05 UTC 2006


"While we have been discussing stuff that is about 10E-20 below
what we can measure, is it possible that the limit of resolution
of the measurement of time is determined by noise? Specifically,
thermal or Johnson noise in those 50 ohm impedances that we
use for our cabling."

This didn't provoke any response, so maybe you think I'm crazy.
You could still be right, but let me explain.

We are talking about the accuracy of clocks, which are basically
digital counters that are mostly immune to noise. You also need
an oscillator that generates something that a clock can count.
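As a very rough sketch (the 10 MHz figure and the names here are
my own assumptions, purely for illustration), the counting part
is nothing more than division:

    # Minimal sketch of a clock as a digital counter.
    # Assumes a hypothetical 10 MHz oscillator; only the
    # division itself matters here.
    OSC_HZ = 10_000_000  # assumed nominal oscillator frequency

    def seconds_elapsed(cycle_count: int) -> float:
        """Convert a raw count of oscillator cycles into seconds."""
        return cycle_count / OSC_HZ

    # A day's worth of cycles reads back as 86400 s; the counting
    # is digital and (mostly) immune to noise.
    print(seconds_elapsed(86_400 * OSC_HZ))  # 86400.0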

Noise gets into it at the interface between the oscillator and
the clock. Assuming electronics (because mechanical oscillators
have too many sources of error), something electronic has to
detect the periodic oscillations and turn them into pulses for
the clock's first counter/divider. Noise in the detector will
cause jitter in the clock.
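
To put a rough number on that, here is a sketch of my own
(assumed values: a 10 MHz sine of 1 V amplitude with a microvolt
of detector noise) converting voltage noise into timing jitter
through the slew rate of the signal at its zero crossing:

    import math

    # Sketch: timing jitter from additive voltage noise at a
    # zero crossing. All values below are assumptions.
    f = 10e6        # oscillator frequency, Hz
    A = 1.0         # signal amplitude, volts
    v_noise = 1e-6  # RMS voltage noise at the detector, volts

    slew_rate = 2 * math.pi * f * A   # V/s at the zero crossing
    t_jitter = v_noise / slew_rate    # RMS jitter per edge, seconds
    print(f"RMS jitter per edge: {t_jitter:.3e} s")  # ~1.6e-14 s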

The problem can be reduced by increasing the signal-to-noise
ratio. Can amplifiers be made noise-free? Certainly not in
radio receivers. We are talking about accuracy better than
10^-15. Johnson noise in 50 ohms is around 10^-6 volt (about a
microvolt) at 100 C with a one MHz bandwidth. To swamp that
noise by a factor of 10^15 you would need a signal of around
10^9 volts.
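
For reference, the usual Johnson-noise formula
v = sqrt(4*k*T*R*B) works out to about a microvolt for those
conditions; this little sketch (my own arithmetic, using the
values above) shows the numbers:

    import math

    k = 1.380649e-23   # Boltzmann constant, J/K
    T = 373.15         # 100 C in kelvin
    R = 50.0           # resistance, ohms
    B = 1e6            # bandwidth, Hz

    v_noise = math.sqrt(4 * k * T * R * B)    # RMS noise voltage
    print(f"Johnson noise: {v_noise:.2e} V")  # ~1.0e-06 V

    # To exceed that noise by a factor of 1e15 in amplitude,
    # the signal would have to be on the order of:
    print(f"Required signal: {v_noise * 1e15:.1e} V")  # ~1e+09 V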

Some precision oscillators are cryogenic these days, which
greatly reduces (though cannot entirely eliminate) thermal noise.

Truly random noise can be reduced by filtering or by averaging
over time. The amount of averaging time available depends on the
use of the clock. A time-of-day clock, averaged over long enough
intervals, should show essentially no error from noise.
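
As a toy illustration of my own (assumed per-edge jitter, white
noise only), averaging N independent edges pulls the error down
roughly as 1/sqrt(N):

    import random, math

    # Toy sketch: white timing jitter averages down as 1/sqrt(N).
    sigma = 1.6e-14   # assumed per-edge RMS jitter, seconds
    N = 1_000_000     # edges averaged (e.g. one second at 1 MHz)

    samples = [random.gauss(0.0, sigma) for _ in range(N)]
    avg_error = sum(samples) / N
    print(f"expected ~{sigma / math.sqrt(N):.1e} s, "
          f"got {abs(avg_error):.1e} s")

That 1/sqrt(N) behavior only holds for white (uncorrelated)
noise; flicker noise and drift do not average away this way.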

Regards,
Bill Hawkins
