[time-nuts] LF power supply noise

Mike Monett xde-l2g3 at myamail.com
Sun Jun 21 18:27:54 UTC 2009


  > Mike,

  [...]

  >> One of  the reasons I was attracted to the 543310A  was  that it
  >> could display 14 digits of frequency in one second.  Since then, I
  >> have figured out a way to resolve 16 digits in one second, so that
  >> part of the spec is no longer interesting.

  > As was  described  by J. J. Snyder  in  "An  Ultra-High Resolution
  > Frequency Meter"  in the FCS 1981 (as available from IEEE  UFFC), I
  > assume; basically  it uses the fact that  adding  more measurements
  > densely in  time raises the degrees of freedom and  allows  for quicker
  > interpolation.

  > Modern counters like the HP 53131 and HP 53132, as well as the
  > Pendulum CNT-90 and Fluke 6690, use similar approaches.

  I can't  find  any copies that are not payware, but  it  is unlikely
  there is any connection between the methods.

  Conventional averaging  methods are limited by the rapid  growth in
  the number of samples required to improve the SNR: each  doubling of
  SNR takes  four  times as many samples. The  measurement  ends up
  taking too  long, or the system drifts, which  renders  the measurement
  useless.

  This places  an  effective ceiling on the  SNR  improvement, and
  hence on  the  precision, that is possible  with  conventional
  technology.

  Binary Sampling  works  a completely different way.  It  ignores the
  amplitude of  the  sample,  and only records  the  direction  of the
  error.

  By the  law of large numbers, the average of  zero-mean Gaussian
  noise converges  to zero. This forces the Binary  Sampler  to converge
  on the  true value of the signal and ignore the error caused  by the
  amplitude of the sample.
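
  To make  the idea concrete, here is a minimal sketch in  Python. It
  is my  own toy model, not the actual circuit: it  tracks  an unknown
  level using  only the sign of each noisy comparison, with  a made-up
  noise figure and step size.

    import random

    # Toy model of sign-only ("binary") sampling. The level, noise and
    # step size are illustration values, not measured data.
    random.seed(1)

    true_level = 0.0     # unknown quantity to locate (arbitrary units)
    sigma      = 18.38   # rms Gaussian noise on each comparison
    estimate   = 10.0    # deliberately wrong starting point
    step       = 0.01    # fixed correction step (sets residual ripple)

    history = []
    for _ in range(200_000):
        noisy = true_level + random.gauss(0.0, sigma)
        # Record only the direction of the error, never its amplitude.
        estimate += step if noisy > estimate else -step
        history.append(estimate)

    tail = history[-50_000:]
    print("mean of last 50k estimates: %.4f" % (sum(tail) / len(tail)))

  Because the  noise has zero mean, the up/down  decisions  balance out
  only at  the  true level, so the estimate settles  there  even though
  no sample amplitude is ever recorded.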

  This is  a  really  big thing. The  problem  of  using  averaging to
  improve sigma  has existed for over 109 years. And  nobody  has been
  able to solve it up till now.

  With a heterodyne sampling system, or conventional mixer technology,
  the sample  delta is the offset frequency divided by  the  square of
  the reference frequency:

  Delta = Offset / (Ref * Ref)

  With an  offset of 1 Hz and a reference of 1 MHz,  the  sample delta
  is

  1 / (1e6)^2 = 1e-12 s,

  or 1 picosecond.

  Since the  Binary  Sampler discards the  noise  amplitude,  the result
  is 1 picosecond resolution in 1 second.
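
  A quick  back-of-the-envelope check of that  arithmetic  (a throwaway
  Python helper, not part of any instrument):

    def sample_delta(offset_hz, ref_hz):
        """Time step per beat sample: offset / ref^2, in seconds."""
        return offset_hz / ref_hz ** 2

    print(sample_delta(1.0, 1e6))   # 1e-12 s, i.e. 1 ps per sample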

  I show  this  on my web site. The schematic for  the  measurement is
  shown in Fig. 1 at

  http://pstca.com/sampler/design.htm

  A simple  boxcar  smoother  is used to  integrate  the  samples. The
  result with different smoothing values is shown at

  http://pstca.com/sampler/smooth.htm
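
  For reference,  a boxcar smoother is just a running mean over  the N
  most recent  samples. A generic sketch (my own, with N  left  as the
  smoothing value you choose):

    def boxcar(samples, n):
        """Running mean of the last n samples at each point."""
        out, acc = [], 0.0
        for i, s in enumerate(samples):
            acc += s
            if i >= n:
                acc -= samples[i - n]
            out.append(acc / min(i + 1, n))
        return out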

  A system  with  18.38  ps rms  jitter  would  require  averaging 338
  waveforms to  obtain  1ps  rms  jitter.  With  conventional sampling
  technology, this  would  require 169 seconds, and  the  system would
  probably drift  during  the measurement,  rendering  the measurement
  invalid.
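
  The factor  of 338 follows from rms averaging improving as  the square
  root of  the number of waveforms, so reaching 1 ps from  18.38  ps rms
  needs roughly:

    n_waveforms = (18.38 / 1.0) ** 2   # about 338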

  So this  amount  of improvement is not  attainable  with conventional
  sampling, and the measurement cannot be made.

  The Binary  Sampler gives 1ps resolution in 1 second. This  is shown
  in Fig. 4 at

  http://pstca.com/sampler/binsamp.htm

  No other  system  can  achieve  this  performance.  And  anyone with
  sufficient skill can duplicate this result at home.

  Extending this to higher frequency, it should be possible  to obtain
  a 1Hz offset at 100MHz with 1 uHz resolution. This gives

  1 / (1e8)^2 = 1e-16 resolution in 1 second.
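
  Using the same sample_delta helper sketched earlier:

    print(sample_delta(1.0, 100e6))   # 1e-16 s per sample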

  I do  not think any existing equipment from  HP,  Fluke, Pendulum,
  or anyone else can come close to matching this level of performance.

  Also, existing  technology  must deal with noise  and  the averaging
  problem described  above, which keeps much of  this  performance out
  of reach.

  As a  result,  conventional  technology cannot  reach  1e-16  in one
  second.

  > As has  been reported, such mechanisms do not fare  well  with ADEV
  > calculations, and especially the overlapping variants of  ADEV and
  > the MDEV  and TDEV which were inspired by  that  particular article,
  > so using it twice forms unwanted filters.

  My approach delivers continuous samples. No missing or extra bits.

  >> The 543310A  can  do   a   single-shot  time  measurement  with a
  >> resolution of  200ps,  and gets down to 1ps  with  averaging. The
  >> HP5370B does  20ps  single-shot,   and  will  resolve  100fs with
  >> averaging. But  I have figured a way to measure  2ps single-shot,
  >> and a bit better with averaging. So that part of the spec  is not
  >> so interesting any more.

  > I assume  you  really mean HP 53310A and not HP  543310A,  even if
  > your typing is consistent. The listed numbers is when weigthing in
  > how various  jitter sources combine upon averaging  and  should be
  > considered a bit conservative.

  > By all  means  describe   what   you   mean  by  2  ps single-shot
  > resolution.

  That will  cure me of trying to type complex numbers  when  copy and
  paste works  so  much better. But now I have to try  and  figure out
  what your "weigthing" really means:)

  The 2ps  single  shot  is  dead  straight  conventional time-to-time
  conversion. Nothing new there, except I think I have some new tricks
  for stabilizing  the  circuits against drift  and  providing  a much
  faster response  to  the  zero-crossing at  the  end  of  the timing
  interval. All of these reduce the noise in the sampling process.

  But the real trick is applying the Binary Sampling technique  to the
  result. That allows a huge reduction in the noise from  the sampling
  process, and converges rapidly on a much more precise solution.

  >> The 543310A will display the phase and frequency changes in a PLL
  >> step response.  But  you can get the frequency  response  just by
  >> looking at  the  VCO  DC error voltage. And if  you  look  at the
  >> voltage across the bottom capacitor in a type 3 loop, you get the
  >> phase response. Here's a picture:

  >>         |
  >>        --- C1
  >>        ---
  >>         |
  >>         |----------O < -  Phase Error
  >>         |       |
  >>        --- C2   \
  >>        ---      / R1
  >>         |       \
  >>         |       |
  >>        ---     ---
  >>         -       -

  > You should recall that when HP built their line of analyzers, they
  > were thinking  "what can we make this cool ZDT  core  do?" rather
  > than attempting to build the best analyzer for all responses.

  I have a slightly different impression. I sold a lot of equipment to
  Boise, Idaho,  and I used to make a lot of trips there in  my Piper
  Malibu, which was about the only realistic way of getting there from
  San Jose:

  http://www.jetphotos.net/viewphoto.php?id=5874208&nseq=0

  There was  a  lot of political infighting at  Boise  involved in the
  development of  new  equipment. The winner pretty  much  had whatever
  say he  wanted in the direction, and he ignored the input  from other
  competent engineers.

  As a  result, the equipment was limited by  that  individual's scope
  and abilities, and much of HP's equipment suffered for it.

  One example  is that the 20 ps single-shot resolution of the  HP 5370B
  has never  been  matched by any later equipment, as far  as  I know. A
  few engineers at Boise complained bitterly about the loss.

  >> So about  the  only thing left of interest is  histograms  of the
  >> jitter. Unfortunately, the 543310A cannot store enough samples to
  >> really make an interesting graph. What I would like to be able to
  >> do is  similar to an invention I made for the disk  industry long
  >> ago, called  Phase Margin Analysis. There is a  brief description
  >> on my web page at

  >>   http://pstca.com/patents.htm#phasemargin

  > Somewhere in  my folder of appnotes there is an  HP  appnote for
  > doing the same to discs, intended for the disc industry, back in
  > the day.

  That may  very well be a result of my invention, which  was  made in
  1970, published in 1979, and copied by IBM in the 1990s.

  But I have all the HP appnotes for disk. I don't recall any  of them
  describing what I show above. Can you provide more information?

  [...]

  > The 53310A  was  a  nice, convenient tool  in  its  time,  but its
  > performance isn't  up to modern standards. It seems  like  HP didn't
  > pursue it  to  much deeper levels after its  VXI  instruments, whereas
  > others went deeper.

  I'm not  sure  about this, but I don't think there is  any  later HP
  equipment that can approach what the 53310A does.

  But I  can  now  match or beat it by orders  of  magnitude.  This is
  sufficient for  our current needs, but I will always  be  working on
  newer technology to break through the limits we now have.

  > Cheers,
  > Magnus

  Thanks,

  Mike



