[time-nuts] GPSDO with all-digital phase/time measurement?
Tom Van Baak
tvb at LeapSecond.com
Wed Feb 26 19:24:38 EST 2014
Not to worry. It's not really "too coarse". Your method is more than enough for millisecond or microsecond timing. Consider that everything about time & frequency is merely an exponent; avoid fuzzy words like coarse or fine. Your approach will work just fine; many a GPSDO has been designed that way.
If you measure your oscillator to a second using WWVB and make adjustments, or if you measure your oscillator to a millisecond using NTP and make adjustments, or if you measure your oscillator to a microsecond using GPS and make adjustments, it is really all the same thing. The exponent is different but the concept is the same.
The final result depends on the time quality of the external 1PPS, the frequency quality of your oscillator, the resolution of your measurement, and the granularity of your adjustment. So every GPSDO "works"; the question is simply how well it works, and whether it meets your needs. For many applications 1 ms or 1 us timing accuracy is more than enough. Only a few applications care about 100 ns or 50 ns. It gets much, much harder as you get into the double- and single-digit ns range.
Several members of this list have developed GPSDOs based on MCU timestamping of 1PPS pulses. It works fine. Perfecting the tuning constants takes some time and depends somewhat on your GPS receiver, your antenna, and your location. Your oscillator, environment, and MCU also play a key part. But the art is to be able to *measure* all of this, to document it, to experiment, and perhaps to improve on it over time.
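The timestamping approach can be sketched in a few lines. This is a minimal, hypothetical Python model of such a steering loop, not any particular member's design: it assumes the MCU latches a free-running 10 MHz counter on each 1PPS rising edge and that a positive EFC value raises the OCXO frequency. The names (`GpsdoLoop`, `on_pps_capture`) and the KP/KI values are illustrative only.

```python
# Sketch of a GPSDO steering loop driven by MCU timestamps of the GPS
# 1PPS edge. Each second the 1PPS interrupt captures the free-running
# 10 MHz counter; the error in counts feeds a simple PI controller that
# trims the OCXO EFC (DAC) value. Counter rollover is ignored here.

NOMINAL = 10_000_000      # expected counts per second at 10 MHz
KP, KI = 0.1, 0.01        # loop constants -- must be found empirically

class GpsdoLoop:
    def __init__(self):
        self.prev_capture = None
        self.integrator = 0.0
        self.dac = 0.0        # EFC correction, arbitrary units

    def on_pps_capture(self, capture):
        """Called once per second with the latched counter value."""
        if self.prev_capture is None:       # first edge: no interval yet
            self.prev_capture = capture
            return self.dac
        delta = capture - self.prev_capture # counts in the last second
        self.prev_capture = capture
        err = delta - NOMINAL               # positive: OCXO running fast
        self.integrator += err
        # Assumed sign convention: negative DAC correction slows the OCXO.
        self.dac = -(KP * err + KI * self.integrator)
        return self.dac
```

In a real unit the KP/KI constants have to be tuned against the oscillator, the receiver's 1PPS jitter, and the environment, which is exactly the measuring and experimenting described above.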
----- Original Message -----
From: "Mark Haun" <haunma at keteu.org>
To: <time-nuts at febo.com>
Sent: Wednesday, February 26, 2014 1:51 PM
Subject: [time-nuts] GPSDO with all-digital phase/time measurement?
> Hi everyone,
> I'm new to the list, and have been reading the recent threads on
> Arduino-based GPSDOs and the pros/cons of 10-kHz vs 1-Hz time pulses with
> interest.
> As I understand it, there are a couple of reasons why one needs a
> time-interval / phase measurement implemented outside the MCU:
> 1) Time resolution inside the MCU is limited by its clock period, which is
> much too coarse. The GPSDO would ping-pong within a huge dead zone.
> 2) Software tends to inject non-determinism into the timing.
> Are there others? I have no background or experience with PLLs/DLLs, so
> I'm really just feeling my way blindly here.
> That being said, I find myself wondering as follows:
> Suppose that we count OCXO cycles (at, say, 10 MHz) using one of the MCU's
> timer/counter peripherals, and periodically sample the counter value with an
> interrupt triggered on the rising edge of the GPS 1pps. Assume that this
> interrupt is the highest priority in the system, so that our measurement is
> fully deterministic, having only the +/- one cycle ambiguity inherent in the
> counting. Also assume that we keep the counter running continuously.
> At this point the time measurement is quite crude, with 100-ns resolution.
> But because we keep the counter running, the unknown residuals will keep
> accumulating, and we should be able to average out this "quantization noise"
> in the long run. That is, we can measure any T-second period to within 100
> ns, so the resolution on a per-second basis becomes 100 ns / T.
> Is there any reason why this sort of processing cannot attain equivalent
> performance to the more conventional analog phase-detection approach?
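The 100 ns / T resolution claim in the question can be spot-checked numerically. Below is a minimal Python sketch that models only the counter's quantization (no GPS jitter or oscillator noise); `TICK`, `F_ERR`, and `measured_offset` are illustrative names, not from any posted design.

```python
# Model a free-running 10 MHz counter sampled at the start and end of a
# T-second interval. The counter floors the true phase to whole 100 ns
# ticks, so any interval is measured to within one tick, and the
# fractional-frequency resolution of the estimate is TICK / T.
import random

TICK = 100e-9            # 10 MHz counter resolution: 100 ns per count
F_ERR = 3.7e-9           # assumed fractional frequency error of the OCXO

def measured_offset(T, rng=random):
    """Time offset accumulated over T seconds, as seen through the counter."""
    phase0 = rng.uniform(0.0, TICK)       # unknown initial phase residual
    true_phase = phase0 + F_ERR * T       # phase at the end of the interval
    start_count = int(phase0 / TICK)      # 0, since phase0 < one tick
    end_count = int(true_phase / TICK)    # counter floors to whole ticks
    return (end_count - start_count) * TICK
```

Dividing `measured_offset(T)` by T gives a frequency estimate whose error shrinks as 1/T, which is the averaging argument in the question; whether that is *equivalent* to an analog phase detector then hinges on the noise sources the model above leaves out.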