[time-nuts] GPS message jitter (was GPS for Nixie Clock)

Scott Stobbe scott.j.stobbe at gmail.com
Mon Jul 18 15:51:44 UTC 2016


I suppose it is one of those cases where the GPS designers decided you
shouldn't ever use the serial data for sub-second timing, and consequently
spent no effort on serial latency and jitter.

Most UARTs I have come across have been synthesized with a 16x baud clock
and include flow control. It would not have been too much effort to spec
the latency as some mean µ ± 100 ns and the jitter as ±1/(16*baud).

For 9600 baud, the jitter on the start bit would be ±6.5 µs.

If CTS were resampled at one full bit time (9600 baud), the jitter would
be ±104 µs.
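For anyone plugging in other baud rates, the arithmetic is just the sampling
period involved. A quick sketch in plain C, nothing receiver-specific:

#include <stdio.h>

/* Worst-case sampling uncertainty of an asynchronous serial edge:
 * a 16x-oversampled UART can only resolve the start bit to within one
 * period of its 16x clock, i.e. 1/(16*baud).  A signal such as CTS that
 * is resampled at the full bit rate is only good to one bit time, 1/baud. */
int main(void)
{
    const double baud = 9600.0;

    double jitter_16x = 1.0 / (16.0 * baud);  /* start-bit jitter, 16x clock  */
    double jitter_1x  = 1.0 / baud;           /* CTS resampled at one bit time */

    printf("16x oversampled start bit: +/- %.1f us\n", jitter_16x * 1e6); /* ~6.5 us   */
    printf("resampled at 1 bit time:   +/- %.1f us\n", jitter_1x  * 1e6); /* ~104.2 us */
    return 0;
}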

On Sat, Jul 16, 2016 at 3:13 PM, Mark Sims <holrum at hotmail.com> wrote:

> I just added some code to Lady Heather to record and plot the time that
> the timing message arrived from the receiver (well, actually the time that
> the screen update routine was called,  maybe a few microseconds
> difference).    I am using my existing GetMsec() routine which on Windoze
> actually has around a 16 msec granularity.  The Linux version uses the
> Linux nanosecond clock (divided down to msec resolution).  I just started
> testing it on a Ublox 8M in NMEA and binary message mode...  really
> surprising results to come shortly...
>
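For what it's worth, a minimal sketch of the kind of millisecond timestamp
Mark describes on the Linux side, assuming clock_gettime() on the nanosecond
clock divided down to milliseconds; the name get_msec() here is illustrative,
not Lady Heather's actual GetMsec():

#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* Read the Linux nanosecond clock and divide down to 1 ms resolution. */
static uint64_t get_msec(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);       /* nanosecond-resolution clock */
    return (uint64_t)ts.tv_sec * 1000u +      /* whole seconds -> ms */
           (uint64_t)ts.tv_nsec / 1000000u;   /* nanoseconds   -> ms */
}

int main(void)
{
    printf("now: %llu ms\n", (unsigned long long)get_msec());
    return 0;
}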


