[time-nuts] PC time app
dennis.c.ferguson at gmail.com
Sat Nov 26 17:35:21 EST 2011
On 25 Nov 2011, at 21:56, Steve . wrote:
> I'm curious as to what folks are doing with PC's that require microsecond
> accuracy for days or weeks or what have you.
> Any examples?
I have a PCI-X board with an FPGA which implements a clock running
at 320 MHz. The 320 MHz can be phase-locked to an external 5 or 10 MHz
frequency input, and the card also has 4 PPS inputs. A transition on
a PPS input causes the FPGA to record a timestamp, with a precision of
just over 3 ns (one 320 MHz clock period), and deliver it to software via an
interrupt. The 10 MHz and PPS outputs from my GPS receiver are synchronous, so
once the board clock is set it keeps the time of the GPS receiver without any
further adjustment.
The system (the OS is NetBSD, but with the kernel timekeeping replaced) computes
its time as a linear function of the CPU's cycle counter, which on my machines
seems to run at a constant 2.4 GHz. I can get a sample timestamp (actually
a pair of them; the board-to-computer time comparison mechanism is the trickiest
part of the design) from the FPGA by doing a load from a card register, so
an 'rdtsc; load; rdtsc' gives me a sample offset between the computer's clock
and the card's clock with a constant systematic error which (arguably) should
be less than +/- 10 ns and with the board's precision of about 3 ns.
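The bracketed read can be sketched in C roughly as below; all the names here
(read_cycles, card_time_register, take_sample) are my own, and the card
register is just a stand-in for the real MMIO load from the FPGA:

```c
#include <stdint.h>

#if defined(__x86_64__) || defined(__i386__)
#include <x86intrin.h>            /* __rdtsc() */
static uint64_t read_cycles(void) { return __rdtsc(); }
#else
#include <time.h>                 /* fallback so the sketch builds off-x86 */
static uint64_t read_cycles(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000000ull + (uint64_t)ts.tv_nsec;
}
#endif

/* Stand-in for the card's time register; the real code loads this from
   the FPGA over the PCI-X bus. */
static volatile uint32_t card_time_register;

typedef struct {
    uint64_t cpu_time;   /* midpoint cycle count attributed to the read */
    uint32_t card_time;  /* timestamp latched from the card */
} offset_sample;

/* 'rdtsc; load; rdtsc': bracket the register load with two cycle-counter
   reads and attribute the card timestamp to the midpoint, so the load
   latency contributes only a (roughly constant) systematic error. */
static offset_sample take_sample(void) {
    uint64_t t0 = read_cycles();
    uint32_t card = card_time_register;
    uint64_t t1 = read_cycles();
    offset_sample s;
    s.cpu_time = t0 + (t1 - t0) / 2;
    s.card_time = card;
    return s;
}
```

The midpoint attribution is what keeps the load latency out of the random
error; it only shows up as the constant systematic term mentioned above.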
I get sample offsets at randomly jittered intervals which average to about
0.25 seconds, so I get about 4 offsets per second with about 3 ns of round-off
noise. The processing of these reduces to a linear least squares fit (the y-value
is the offset, the x-value is the time of the sample with respect to the
computer's clock) after some sanity filtering. The least squares fit gives
me a frequency error and a time offset error, along with confidence intervals
for each. I adjust the computer's clock when either the frequency error or
the time offset becomes non-zero with 80% confidence.
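The fit itself reduces to simple linear regression. A minimal sketch under
those assumptions (lsq_fit is a name of my own; the sanity filtering and the
confidence-interval math are omitted):

```c
#include <stddef.h>

/* Ordinary least squares fit of y = intercept + slope * x.
   With x = sample time on the computer's clock and y = measured offset,
   the slope estimates the frequency error and the intercept the time
   offset at x = 0. */
static void lsq_fit(const double *x, const double *y, size_t n,
                    double *slope, double *intercept) {
    double sx = 0.0, sy = 0.0, sxx = 0.0, sxy = 0.0;
    for (size_t i = 0; i < n; i++) {
        sx  += x[i];
        sy  += y[i];
        sxx += x[i] * x[i];
        sxy += x[i] * y[i];
    }
    double denom = (double)n * sxx - sx * sx;
    *slope     = ((double)n * sxy - sx * sy) / denom;
    *intercept = (sy * sxx - sx * sxy) / denom;
}
```

In practice the confidence intervals come from the same sums (via the
residual variance), which is what makes the 80%-confidence adjustment
criterion cheap to evaluate on every new sample.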
Typically I find the result of this to be, very roughly, a clock adjustment
every 10 seconds, with a frequency adjustment on the order of 10^-9 and a
time adjustment on the order of 10 ns. This is not perfectly reliable, of course;
if I leave the cover off the computer and cold-spray the computer's innards I can
drive the clock crazy, so it depends on temperature variations inside the case
being modest, or at least occurring relatively slowly compared to my offset sample
rate. When left alone in a rack in a quiet room, however, I seldom see anything
bad happening, so I think it isn't dangerous to assert that the arrangement is
typically keeping the computer's clock within +/- 20 ns of the GPS receiver, with
worst case excursions being no worse than maybe +/- 50 ns.
This has a number of uses, but is particularly good for NTP and PTP development.
You can use a board in a server to synchronize the server's system clock to a GPS
receiver, and then use a board tracking the same GPS receiver in a client machine
to independently measure how well the software is managing the client machine's
clock. This avoids having the NTP or PTP software grade its own homework.