[time-nuts] Continuous timestamping reciprocal counter question

Magnus Danielson magnus at rubidium.dyndns.org
Sun May 15 21:56:08 UTC 2011


Hi Fred,

On 05/15/2011 10:55 PM, Tijd Dingen wrote:
> Hi Magnus,
>
> Magnus Danielson wrote:
>>  Well, you always have the corner-case where numerical precision and near
>>  same frequency beating comes into play, so what will help and what will
>>  reduce your precision becomes a little fuzzy to say in general terms.
>>  That's why I am careful to say that "they are roughly the same".
>
> Okay, then I understand what you mean. Explaining the fun numerical
> intricacies would be a whole other thread. And quite possibly a whole
> other forum. ;-)

We are time-nuts, we delve into the details. Oh the gore and blood!
I just thought it was not the most important thing for you right now; 
keep your eyes on the road for the project.

> "They are roughly the same" is something I can work with.

Great.

>>  If you run exact Nth edge you could do some algorithmic steps that
>>  avoid some rounding errors. Still, N can be allowed to be fairly large
>>  (say 1 million). Another algorithmic benefit is that you could put your
>>  pre-processing up front in the FPGA.
>
> Understood. Amusingly enough by the exact same token, some algorithms
> can run themselves into singularity trouble precisely because of the data
> being too regular.
>
> But rest assured I'll try several ways to do that linear regression. Right
> now it was just a sanity check that I am not overlooking something stupid.

Recall that many counters do not use linear regression. It's just one 
of several algorithms. Maybe you should stock up on a few different 
algorithms and figure out which works best... and possibly when. You 
know... to learn :)
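
To make this concrete, here is roughly what the linear-regression variant
looks like in post-processing (a Python sketch, untested; the names and
data layout are my own). Assuming you time-stamp every Nth event, the
frequency estimate is simply the slope of accumulated event count versus
time-stamp:

import numpy as np

def ls_frequency_estimate(timestamps, events_per_stamp):
    """Least-squares fit of accumulated event count versus time-stamp.

    timestamps       : time-stamps in seconds from the counter
    events_per_stamp : number of input cycles between consecutive stamps (N)

    Returns the estimated input frequency in Hz (the slope of the fit).
    """
    t = np.asarray(timestamps, dtype=float)
    n = np.arange(len(t)) * float(events_per_stamp)  # accumulated event count
    # Remove the means first to keep the estimate well conditioned; this is
    # one of the numerical intricacies hinted at above.
    t0 = t - t.mean()
    n0 = n - n.mean()
    return np.dot(t0, n0) / np.dot(t0, t0)           # events per second = Hz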

>>  They will not be greatly different as far as I can see. Do recall that
>>  linear regression may need a drift component to it. I regularly see
>>  bending curves.
>
> Check. I also plan to include a scatterplot with that line fit so you're
> able to get a feeling for the data the frequency estimate is based on.

Got to love the residue plots! A residue max/rms/min value can be 
useful. Relative values are also handy. I really miss the drift number on 
my display. When waiting for oscillators to heat up or lock in I care 
more about seeing the rate of change than the actual number. Flipping 
between actual and relative presentation is a presentation issue and not 
a counter processing mode.
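
As a sketch of those numbers (Python again, names are my own; the drift
handling here is just the simple polynomial kind):

import numpy as np

def fit_residue_stats(timestamps, events_per_stamp, with_drift=False):
    """Fit event count versus time, optionally with a quadratic drift term,
    and return the max / rms / min of the time residuals in seconds."""
    t = np.asarray(timestamps, dtype=float)
    t = t - t[0]
    n = np.arange(len(t)) * float(events_per_stamp)
    degree = 2 if with_drift else 1           # the quadratic term models drift
    coeffs = np.polyfit(t, n, degree)
    resid_events = n - np.polyval(coeffs, t)  # residuals in cycles
    freq = coeffs[-2]                         # linear coefficient ~ Hz at t[0]
    resid_time = resid_events / freq          # convert cycles to seconds
    return resid_time.max(), np.sqrt(np.mean(resid_time**2)), resid_time.min()

The drift number itself is then twice the quadratic coefficient (in Hz/s)
when with_drift is set, since dn/dt = b + 2*a*t.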

>>  You can never be quite sure you see every Nth edge. You can see every
>>  Nth edge that your digital side detected. You will need to ensure that
>>  trigger level and signal quality is good to avoid cycle slipping on the
>>  trigger side. It requires care on the analogue side and adjustments of
>>  trigger levels to the signal at hand. I've seen lost pulses and double
>>  or additional triggers too many times.
>
> In which case I think I now know what you meant by cycle-slip in the other
> post. The analog front-end is for now largely on the TODO list. The
> digital processing part is a larger bottleneck than the analog frontend,
> so I am tackling that part first. If I cannot get the counter core to
> work, no point in having a fancy analog front-end...

As you should have read by now, I had something different in mind. What 
I mean here is really the analogue side of things.

>>  You would indeed be able to avoid a hardware pre-scaler, but you would
>>  need a darn good analogue front-end to make sure the input side has the
>>  slew-rate needed. Lacking slew-rate can be problematic and can cause you
>>  to lose cycles or get multiple triggers.
>
> Indeed, which brings all sorts of fun challenges of their own. Which is
> why for now I do not use the serdes and keep the input frequency
> relatively low.

Indeed. You can do some "digital filtering" to home in on your signal: 
essentially creating a requirement for the signal to fall within some 
window of counts... which can be used to filter out some of the trigger 
noise.
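
In post-processing terms the idea looks roughly like this (a Python sketch;
the tolerance and the names are my own choice, and in the FPGA you would do
the equivalent with counters and comparators):

def window_filter(raw_stamps, nominal_period, tolerance=0.25):
    """Keep only time-stamps whose spacing from the previously accepted
    stamp falls inside a window around the nominal period.

    Too-early stamps look like double/extra triggers and are dropped; a gap
    much longer than nominal suggests missed edges, which are counted so
    the event counter can be corrected downstream.
    """
    lo = (1.0 - tolerance) * nominal_period
    hi = (1.0 + tolerance) * nominal_period
    accepted = [raw_stamps[0]]
    missed = 0
    for stamp in raw_stamps[1:]:
        gap = stamp - accepted[-1]
        if gap < lo:
            continue                                   # extra trigger: drop it
        if gap > hi:
            missed += round(gap / nominal_period) - 1  # estimate of lost edges
        accepted.append(stamp)
    return accepted, missed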

>>  Also, you will get a high data-rate out of the SERDES which a FW
>>  pre-scaler needs to sort out, but in parallel form rather than serial
>>  form.
>
> Yeah, but that is totally easy. I've already done a module that does that.
> You only need 2 stages each of 1 logic level deep with a bunch of LUT6's.
>
> However, to keep things simpler on the coarse counter front, I currently
> don't use that.

It is pretty easy yes.

>>  The SERDES provides a wonderful digital front-end for high-speed
>>  signals, but the fixed sampling rate provides little interpolation
>>  power; a 10 Gb/s SERDES can sample every 100 ps for you.
>
> Yep, which is why IMO it is better not to use the serdes as interpolator.
> You can use it for your coarse event counter. The main drawback to that is
> that your entire event counter is by definition sampled. This as opposed
> to a free-running counter that counts on the events, and is then sampled.
>
> Another way of saying that is: with the serdes as a sampler, the signal
> from the DUT is nothing but data. There are no flip-flops that toggle on
> the clock of the DUT.

Exactly. From this parallel stream you then process out the number of 
rising/falling edges (event-counter increment for that cycle) and the 
time-interpolator values. You can do exact Nth edge with some effort.

The benefit is that you can get very high time-stamp rates at the 
relatively coarse time interpolation of 100 ps.
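
As a toy model of that bookkeeping (Python, the word width and names are
mine; in the FPGA the loop collapses into a small LUT tree much like the
two-stage structure you describe):

def process_serdes_word(word, width, prev_bit):
    """Scan one parallel SERDES sample word, oldest bit first.

    word     : integer holding `width` consecutive one-bit samples
    width    : samples per word (at 10 Gb/s each bit is 100 ps)
    prev_bit : last sample of the previous word, to catch edges on the seam

    Returns (rising_edges, first_edge_pos, last_bit): the event-counter
    increment for this word, the bit index of the first rising edge (None
    if there was none) which is the coarse interpolator value, and the
    state to carry into the next word.
    """
    edges = 0
    first_pos = None
    last = prev_bit
    for i in range(width):
        bit = (word >> i) & 1        # assumes the LSB is the oldest sample
        if bit and not last:         # 0 -> 1 transition: a rising edge
            edges += 1
            if first_pos is None:
                first_pos = i
        last = bit
    return edges, first_pos, last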

>>  You will have to work with multiple possible trigger-locations, but it
>>  is possible to post-process out.
>
> Indeed. It is not an impossibility. It's a tradeoff regarding pipeline
> complexity.

Agreed.

>> >> I have not looked at a detailed performance comparison between these
>> >> algorithms lately. However, they should not be used naively together
>> >> with AVAR and friends since they attempt to do the same thing, so the
>> >> resulting filtering will become wrong and biased results will be
>> >> produced.
>>
>> > Well, for the AVAR calculation I only use the raw time-stamps. So nothing
>> > preprocessed. Then I should not have to worry about this sort of bias,
>> > right?
>
>>  Exactly, if you use raw time-stamps and have decent quality on tau0
>>  measures, you have avoided a lot of problems.
>
> This is good to know. Thank you for your detailed posts! :)

Happy to assist.
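
For reference, the raw time-stamp path into AVAR is not much code either.
Assuming you have already reduced the time-stamps to time-error samples x
taken every tau0 seconds, the overlapping Allan variance boils down to
something like this (Python sketch, the name is mine):

import numpy as np

def overlapping_avar(x, tau0, m):
    """Overlapping Allan variance from time-error samples.

    x    : time-error samples in seconds, one every tau0 seconds
           (the raw time-stamps with the nominal ramp removed)
    tau0 : spacing between samples in seconds
    m    : averaging factor, so the variance is evaluated at tau = m*tau0
    """
    x = np.asarray(x, dtype=float)
    tau = m * tau0
    d2 = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]  # second differences of phase
    return np.sum(d2 ** 2) / (2.0 * tau ** 2 * d2.size)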

In the process I had a refinement on an idea of mine, so it was good for 
me too. :)

Cheers,
Magnus



