[time-nuts] Re: Timestamping counter techniques : dead zone quantification
Erik@tinySA
erik at kaashoek.com
Mon Feb 7 15:57:34 UTC 2022
Tom,
Thanks, the concept of de-trending is understood, and the first
sub-sample is a very good estimate of the trend, but it's a bit
frustrating not to understand how to implement the regression sums in
integer math when the x interval of the sub-samples is not constant:
the capture moment depends on the incoming signal edges, and one
cannot predict when the timer capture interrupt arrives.
In a simulation using actual sub-sample data, de-trending y with a
constant value (i.e. assuming a constant x interval) increases the
standard error from around 1e-11 to 1e-9, so this variation in the x
interval seems big enough to matter.
Only when doing the sub-sample calculations in float can I see how to
de-trend, but floats have insufficient accuracy and doubles would
become way too slow.
So I still need to do a lot of reading....
Erik.
On 7-2-2022 16:05, Tom Van Baak wrote:
> Erik,
>
> The hp 53132A counter was mentioned in an earlier posting. Check the
> documentation on the frequency command(s) and also the programming
> examples in the appendix. Look for words like: "pre-measurement",
> "expected frequency", and "optimizing throughput". Another good source
> is the SRS FS740 manual, as well as Pendulum CNT-91 documents.
>
> The least squares fit (regression) is ok in textbooks but, especially
> for large blocks of timestamp data, you run into loss of precision and
> range problems, as you've seen.
>
> The trick that I use is to roughly detrend the data before you compute
> the regression. I know that sounds odd, to detrend before you apply a
> formula to compute the trend, but when you look at your sums you will
> see why it works so well. We don't have access to hp or srs source
> code, but perhaps this is why regression-based counters make use of
> the expected value.
>
> Here are two examples based on 10 000 picPET timestamp data, with
> debug mode turned on [1]:
>
> (1) A not-so-pretty least squares fit directly from raw timestamp
> data. Note r^2 and steyx are suspect:
>
> sums:
>          833333324.999999880000000 Sxx
>          833333316.906344180000000 Syy
>          833333320.953173520000000 Sxy
> 694444423810844930.000000000000000 Sxy*Sxy
> 694444423810842370.000000000000000 Sxx*Syy
>                  1.000000000000004 Sxy*Sxy/Sxx/Syy
>          833333316.906344180000000 Syy
>          833333316.906347160000000 Sxy*Sxy/Sxx
>                 -0.000002980232239 Syy-Sxy*Sxy/Sxx
> stats:
>     10000.000000   1.000000000000000e+004 n
>       499.950000   4.999500000000000e+002 x_mean
>       499.949998   4.999499977851931e+002 y_mean
>       288.689568   2.886895679907167e+002 x_sdev
>       288.689567   2.886895665887843e+002 y_sdev
>         1.000000   9.999999951438083e-001 m
>         0.000000   2.130461211891088e-007 b
>         1.000000   1.000000000000004e+000 r2
>        -1.#IND00  -1.#IND00000000000e+000 steyx
>
> (2) Since this is tau 0.1 s data, I apply a pre-detrend of 0.10001 to
> the data before the least squares fit:
>
> sums:
>      8.333333245853750 Sxx
>      8.334142635551871 Syy
>      8.333737930750992 Sxy
>     69.451187898437823 Sxy*Sxy
>     69.451187900531593 Sxx*Syy
>      0.999999999969853 Sxy*Sxy/Sxx/Syy
>      8.334142635551871 Syy
>      8.334142635300619 Sxy*Sxy/Sxx
>      0.000000000251251 Syy-Sxy*Sxy/Sxx
> stats:
>     10000.000000   1.000000000000000e+004 n
>        -0.049995  -4.999499998841863e-002 x_mean
>     26719.148510   2.671914850958520e+004 y_mean
>         0.028869   2.886895679188980e-002 x_sdev
>         0.028870   2.887035873203724e-002 y_sdev
>         1.000049   1.000048562188179e+000 m
>     26719.198507   2.671919850701305e+004 b
>         1.000000   9.999999999698527e-001 r2
>         0.000000   1.585249899616256e-007 steyx
>
> You can play around with this to determine the right approach given
> the frequency, batch size, quantization, and noise of your counter.
>
> And again, a suggestion to re-read the TimeLab, TimePod, 53132A, and
> FS740 literature, even once a week. The more you play with your own
> counter the more you will understand what's in those manuals. Notice
> also that these regression-based counters don't have to work with
> fixed gate times either.
>
> /tvb
>
> [1] See xystats3.c / .exe in my leapsecond.com/tools/ directory.
>
>
> On 2/6/2022 4:40 AM, Erik Kaashoek wrote:
>> 4: I've looked into the math producing the steyx and it's clear there
>> are insufficient digits (16) in my math; only with low input
>> frequencies, short gate times and a low subsample rate can it always
>> produce a relevant (non-zero) number. I have no clue how to reduce
>> the digit count. I tried subtracting an estimated global trend, but as
>> the x intervals are not constant that does not work with the integer
>> math. I tried shifting y so the sumxy term gets lower, but that is
>> insufficient as sumy2 is already > 1e+16 with a 10 MHz input and will
>> be even worse with a 100 MHz input; sumy2 is > 1e+20 with a 0.1 s gate
>> time. So it seems the steyx is usable for detecting that one is
>> measuring noise, but otherwise only under very specific conditions.
> _______________________________________________
> time-nuts mailing list -- time-nuts at lists.febo.com -- To unsubscribe
> send an email to time-nuts-leave at lists.febo.com