[time-nuts] Question about frequency counter testing

Magnus Danielson magnus at rubidium.dyndns.org
Thu May 17 21:36:18 UTC 2018


Hi,

On 05/13/2018 11:13 PM, Oleg Skydan wrote:
> Hi Magnus,
> 
> From: "Magnus Danielson" <magnus at rubidium.dyndns.org>
>> I would be inclined to just continue the MDEV compliant processing
>> instead. If you want the matching ADEV, rescale it using the
>> bias-function, which can be derived out of p.51 of that presentation.
>> You just need to figure out the dominant noise-type of each range of
>> tau, something which is much simpler in MDEV since White PM and Flicker
>> PM separates more clearly than the weak separation of ADEV.
> 
> 
>> As you measure a DUT, the noise of the DUT, the noise of the counter and
>> the systematics of the counter adds up and we cannot distinguish them in
>> that measurement.
> 
> Probably I did not express what I meant clearly. I understand that we
> cannot separate them, but if the DUT noise has most of its power inside
> the filter BW while the instrument noise is wideband, we can filter out
> part of the instrument noise with minimal influence on the DUT noise.

Yes, if for a certain range of tau you can show that the instrument's
noise is not dominant, then you are measuring the DUT. This is what
happens as the counter's 1/tau slope on the ADEV plot reaches down to
the DUT noise, where the resulting curve is mostly DUT noise.

We may then hunt for better counters to shift that slope leftwards on
the plot and see more of the DUT noise.

>> There are measurement setups, such as
>> cross-correlation, which make multiple measurements in parallel and
>> can start to combat the noise separation issue.
> 
> Yes, I am aware of that technique. I even did some experiments with
> cross-correlation phase noise measurements.

Check.

>> Ehm no. The optimal averaging strategy for ADEV is to do no averaging.
>> This is the hard lesson to learn. You can't really cheat if you aim to
>> get proper ADEV.
>>
>> You can use averaging, and it will cause biased values, so you might use
>> the part with less bias, but there are safer ways of doing that: going
>> full MDEV or PDEV instead.
>>
>> With biases, you have something similar to, but not, _the_ ADEV.
> 
> OK. It looks like the last sentence very precisely describes what I was
> going to do, so we understood each other correctly. Summarizing the
> discussion, as far as I understand, the best strategy regarding *DEV
> calculations is:
> 1. Make MDEV the primary variant. It is suitable for calculation inside
> the counter as well as for exporting data for later post-processing.

Doable.
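
For reference, a minimal sketch of the standard MDEV estimator from
phase data, just to make the processing concrete. The names are mine,
not from your firmware, and it assumes phase samples x[] in seconds at
a fixed spacing tau0:

#include <math.h>
#include <stddef.h>

/* Modified Allan deviation at tau = m*tau0 from N phase samples x[]
 * (in seconds), using the usual averaged-second-difference form. */
double mdev(const double *x, size_t N, size_t m, double tau0)
{
    if (m == 0 || N < 3 * m)
        return 0.0;                       /* not enough data */

    double tau = (double)m * tau0;
    size_t terms = N - 3 * m + 1;         /* outer summation length */
    double sum = 0.0;

    for (size_t j = 0; j < terms; j++) {
        double acc = 0.0;
        /* average the second difference over m adjacent samples */
        for (size_t i = j; i < j + m; i++)
            acc += x[i + 2 * m] - 2.0 * x[i + m] + x[i];
        acc /= (double)m;
        sum += acc * acc;
    }
    return sqrt(sum / (2.0 * tau * tau * (double)terms));
}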

> 2. Study how the PDEV calculation fits the HW used. If it can be done
> in real time, a PDEV option can be added.

You build two sums, C and D: one is the phase samples and the other is
the phase samples scaled by their index n in the block. From these you
can then, using the formulas I provided, calculate the least-squares
phase and frequency, and using the least-squares frequency measures you
can do PDEV. The up-front processing is thus cheap, and there are
methods to combine measurement blocks into longer measurement blocks,
i.e. decimation, using relatively simple linear processing on the block
sums C and D together with their respective lengths. The end result is
that you can very cheaply decimate data in HW/FW and then extend the
properties to arbitrarily long observation intervals using cheap
software processing, creating unbiased least-squares measurements along
the way. Once the linear algebra of least-squares processing has
vanished in a puff of logic, what remains is fairly simple processing
with very little memory required. For multi-tau, you can reach
O(N log N) type of processing rather than O(N^2), which is pretty cool.

I hope to have an updated version of that article available soon.
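
To make the block sums concrete, here is a rough sketch under my own
naming (not the article's), assuming phase samples in seconds at
spacing tau0, with a simple non-overlapping PDEV estimator at the end.
The block-combining/decimation formulas from the article are not
reproduced here:

#include <math.h>
#include <stddef.h>

/* Per-block accumulators: only two running sums are kept in FW. */
typedef struct {
    double C;       /* sum of phase samples x[n]       */
    double D;       /* sum of n * x[n], n = 0..N-1     */
    size_t N;       /* samples accumulated so far      */
} block_sums;

void block_add(block_sums *b, double x)
{
    b->D += (double)b->N * x;   /* weight by the current index n */
    b->C += x;
    b->N++;
}

/* Least-squares frequency over one block (slope of the LS line fitted
 * to the phase samples, spacing tau0):
 *   y = 12 * (D - (N-1)/2 * C) / (tau0 * N * (N^2 - 1))          */
double block_ls_freq(const block_sums *b, double tau0)
{
    if (b->N < 2)
        return 0.0;
    double N = (double)b->N;
    return 12.0 * (b->D - 0.5 * (N - 1.0) * b->C)
           / (tau0 * N * (N * N - 1.0));
}

/* PDEV at tau = N*tau0 from consecutive, non-overlapping block
 * frequency estimates y[0..m-1]: PVAR = 0.5 * <(y[k+1] - y[k])^2>. */
double pdev(const double *y, size_t m)
{
    if (m < 2)
        return 0.0;
    double sum = 0.0;
    for (size_t k = 0; k + 1 < m; k++) {
        double d = y[k + 1] - y[k];
        sum += d * d;
    }
    return sqrt(0.5 * sum / (double)(m - 1));
}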

> 3. ADEV can be safely calculated only from the Pi mode counter data.
> Probably it will not be very useful because of the low single shot
> resolution, but Pi mode and the corresponding data export can be easily added.

You will be assured it is bias-free. You want to have that option.
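
For the Pi-mode export path, the standard overlapping ADEV estimator is
also small; again only a sketch with my own naming, assuming phase
samples in seconds at spacing tau0:

#include <math.h>
#include <stddef.h>

/* Overlapping Allan deviation at tau = m*tau0 from N phase samples
 * x[] (in seconds), exported from the Pi (no-averaging) mode. */
double adev(const double *x, size_t N, size_t m, double tau0)
{
    if (m == 0 || N < 2 * m + 1)
        return 0.0;                      /* not enough data */

    double tau = (double)m * tau0;
    size_t terms = N - 2 * m;
    double sum = 0.0;

    for (size_t i = 0; i < terms; i++) {
        double d = x[i + 2 * m] - 2.0 * x[i + m] + x[i];
        sum += d * d;
    }
    return sqrt(sum / (2.0 * tau * tau * (double)terms));
}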

> I think it will be more than enough for my needs, at least now.
> 
>> From the 2.5 ns single shot resolution, I deduce a 400 MHz count clock.
> 
> Yes. It is approx. 400 MHz.

OK, good to have that verified. Free-running or locked to a 10 MHz
reference?

>>> I have no FPGA also :) All processing is in the FW, I will see how it
>>> fits the used HW architecture.
>>>
>>> Doing it all in FPGA has many benefits, but the HW will be more
>>> complicated and pricier with minimal benefits for my main goals.
>>
>> Exactly what you mean by FW I don't get; for me that is FPGA code.
> 
> I meant MCU code; to make things clearer I can use the SW term for it.
> 
> Thank you for the answers and explanations, they are highly appreciated!

Nice! I really hope you can make sense of them and apply them, and that
I have contributed some insight into what to do, and when, in order to
make good measurements.

Cheers,
Magnus


