[time-nuts] ADEV and Tau-0 question

Don @ True-Cal True-Cal at swbell.net
Sat Nov 1 21:29:26 UTC 2008


Fellow Time-Nuts,

I am having great fun with Ulrich's EZGPIB and Plotter programs to automate my ADEV and TI measurements. Wow, what a nice set of programs, thanks Ulrich!

I use the SR620 TIC with a Fury board as an external reference. The Fury disciplines a 10811-60168 external oscillator. I can go unlocked to improve the range around Tau 100 s if and when necessary. For a series of tests, I used an LPRO-101 10 MHz signal to drive the B-Ch (Stop) of the SR620; the A-Ch (Start) was set to Ref. for a zero-crossing TIME measurement on the TIC.

I streamlined the EZGPIB SR620 query program and experimented with counter settings to minimize the inevitable and inherent latencies of the computer layers, network, GPIB-Enet/100 bridge, and the counter (the counter being the worst). With the counter set to 100 samples and the 1 kHz "Ref" used as the START, I was expecting a new 100-sample TI average every 0.1 seconds. My first evidence that something was not ideal was embedded in the details of the EZGPIB output console and the accompanying file: sometimes there were 7, 8, or 9 samples per second, and never 10. Also, the total time span of a large collection of samples was always slightly longer than the product of the sample interval and the sample count. I used Excel to scan 18000 of the 0.1 s TI samples to determine what the actual statistics might be:

Average = 0.122302796 sec
Min = 0.108984648 sec
Max = 0.188015099 sec
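For anyone who wants to repeat the check without Excel, here is a small Python sketch of the same interval scan. The timestamp list below is made up purely to illustrate the calculation; in practice you would feed it the capture timestamps from the EZGPIB log file.

```python
# Sketch of the spreadsheet check: given the capture timestamps
# (in seconds), compute the average, minimum, and maximum interval
# between successive samples. The timestamps here are synthetic.

def interval_stats(timestamps):
    """Return (average, min, max) of intervals between successive timestamps."""
    diffs = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return sum(diffs) / len(diffs), min(diffs), max(diffs)

if __name__ == "__main__":
    ts = [0.0, 0.1, 0.2, 0.35, 0.45, 0.62]  # nominal 0.1 s with latency stretches
    avg, lo, hi = interval_stats(ts)
    print(f"Average = {avg:.9f} sec")
    print(f"Min     = {lo:.9f} sec")
    print(f"Max     = {hi:.9f} sec")
```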

Since the ADEV function, as well as Ulrich's Plotter program, requires a constant Tau-0, I experimented with both the nominal 0.1 s and the real "average" of 0.1223 s as the Tau-0 setting, and attached a graph that illustrates the variance across Tau. My question is: what is "acceptable" practice for defining Tau-0 when a stable sampling interval is unlikely? It was rather simple to specify a more accurate sample interval once it was determined by the extra step of spreadsheet analysis, and the effect on the results is obvious. But that is still only an average. What about the effect of the deviation about that average value? That would seem to be a much more complex issue to deal with.

See attached export or Plotter graphic.

Regards...
Don
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Tau-0 Correction Variance.pdf
Type: application/pdf
Size: 31544 bytes
Desc: not available
URL: <http://febo.com/pipermail/time-nuts_lists.febo.com/attachments/20081101/025876d4/attachment.pdf>

