[time-nuts] Rb as source for ADEV?
Tom Van Baak
tvb at LeapSecond.com
Thu Feb 6 22:22:55 EST 2014
> I never knew about these different versions of ADEV.
> Can you point me to any reference?
There are a couple of separate issues regarding ADEV.
In old literature ADEV was computed using adjacent segments of data. This is about all you could do with an HP 5360A computing counter. Once real computers got into the game, it was possible to use the "overlapping" version of ADEV, which "milks" more information from the data set. You can see the two different formulas for computing it at: http://www.wriley.com/paper2ht.htm
See calc_adev() source code at: http://leapsecond.com/tools/adev_lib.c
Really, the only thing the overlapping version does is use a "stride" of 1 instead of tau. This is possible when you have the whole data set in memory. The more primitive back-to-back ADEV could be computed as a running sum, requiring no data storage at all (typical of 60's and 70's instruments).
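To make the stride point concrete, here's a minimal Python sketch of ADEV from phase data. This is not the adev_lib.c implementation, just an illustration; it assumes phase samples x (in seconds) taken at interval tau0, evaluated at tau = m*tau0. The only difference between the two versions is the stride, exactly as described above.

```python
def adev(x, m, tau0=1.0, overlapping=True):
    # Allan deviation at tau = m*tau0 from phase data x.
    # Overlapping version strides by 1 sample; the old
    # back-to-back version strides by m (adjacent segments).
    stride = 1 if overlapping else m
    tau = m * tau0
    # Second differences of phase, squared.
    terms = [(x[i + 2*m] - 2*x[i + m] + x[i]) ** 2
             for i in range(0, len(x) - 2*m, stride)]
    return (sum(terms) / (2 * len(terms) * tau ** 2)) ** 0.5
```

For a constant frequency offset (phase a straight line) the second differences vanish and ADEV is zero; for pure linear frequency drift both versions agree exactly, but on noisy data the overlapping version averages many more terms at large tau, which is where the extra confidence comes from.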
Regardless of back-to-back or overlap, there's also the question of how many points to plot. Again, in the early days, because both computation and plotting were very time-consuming, people tended to plot only a few points per decade. Maybe tau 1,10,100,1000 or 1,2,5,10,20,50, or 1,2,4,8,16,32, etc. To make it look more like a graph they would connect the dots with lines (and guesswork). These days, calculating ADEV is so fast there's no need to even draw the lines; just compute ADEV for every tau you can imagine and the dots connect themselves due to their density.
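For what it's worth, the classic sparse tau sequences mentioned above are trivial to generate (illustrative only; any starting decade works the same way):

```python
# Decade points: 1, 10, 100, 1000
decade_taus = [10 ** k for k in range(4)]
# 1-2-5 per decade: 1, 2, 5, 10, 20, 50
one_two_five = [d * s for d in (1, 10) for s in (1, 2, 5)]
# Octave (power-of-two) points: 1, 2, 4, 8, 16, 32
octave_taus = [2 ** k for k in range(6)]
```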
Stable32 has an "all tau" option in which ADEV is computed for every possible tau. E.g., 1 to 100,000. However, it turns out this is overkill. Not so much for small tau (say 1 to 100), but once you get up to the thousands or tens of thousands there's usually no significant difference between using tau N and N+1. And it can actually take a lot of time to compute ADEV hundreds of thousands of times. So we are now in the era of "many tau" which computes lots of tau *per decade*. Think of it as a logarithmic sweep of tau instead of a linear sweep. For large data sets this is orders of magnitude faster than "all tau", yet it still fills in all the gaps in the plot with real points, not extrapolated lines. Note that Timelab does "many tau" by default.
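A logarithmic sweep of tau is easy to sketch. This is just one way to do it (not how Stable32 or Timelab actually generate their lists); it rounds to integer multiples of tau0 and drops duplicates, so at small tau it naturally degenerates to "all tau" and at large tau it stays a fixed number of points per decade:

```python
def many_tau(tau_min, tau_max, per_decade=20):
    # Integer tau values, logarithmically spaced,
    # roughly 'per_decade' points per decade.
    taus, k = [], 0
    while True:
        t = round(tau_min * 10 ** (k / per_decade))
        if t > tau_max:
            break
        if not taus or t != taus[-1]:   # skip duplicates after rounding
            taus.append(t)
        k += 1
    return taus
```

For example, many_tau(1, 100000) gives a few hundred points instead of the 100,000 an "all tau" sweep would compute, while the plot looks just as dense on a log axis.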
And the third issue is, of course, what child in the ADEV family to use: ADEV, MDEV, TDEV, HDEV, etc.
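As one illustration of a sibling, here's a minimal MDEV sketch from phase data, using the standard textbook formula (again an illustration, not code from adev_lib.c). The extra inner sum is the phase averaging that distinguishes MDEV from ADEV; for m=1 the two are identical.

```python
def mdev(x, m, tau0=1.0):
    # Modified Allan deviation at tau = m*tau0 from phase data x.
    # Like ADEV, but the second difference is first averaged
    # over m adjacent samples before squaring.
    tau = m * tau0
    n = len(x)
    terms = []
    for j in range(n - 3*m + 1):
        s = sum(x[i + 2*m] - 2*x[i + m] + x[i] for i in range(j, j + m))
        terms.append(s * s)
    return (sum(terms) / (2 * m**2 * tau**2 * len(terms))) ** 0.5
```

TDEV is then just MDEV rescaled by tau/sqrt(3), and HDEV swaps the second difference for a third difference, so the whole family shares this same skeleton.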
Does all this make sense? I can post graphic examples of all these issues if you're interested.