[time-nuts] Phase Noise and ADCs

John Ackermann N8UR jra at febo.com
Sat Sep 26 15:10:08 UTC 2020


We know that phase noise scales with frequency, so if you multiply the 
frequency by 10 you get a 20 dB increase in phase noise.
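
Just to make the arithmetic I have in mind explicit, here's a quick 
Python sketch of that ideal multiplier/divider scaling (the 20 dB figure 
is just 20*log10(10)):

import math

def noise_delta_db(multiplication_factor):
    # Ideal frequency multiplication/division changes phase noise
    # by 20*log10(N) dB relative to the input.
    return 20 * math.log10(multiplication_factor)

print(noise_delta_db(10))   # +20.0 dB when multiplying frequency by 10
print(noise_delta_db(0.1))  # -20.0 dB when dividing frequency by 10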

What I don't fully understand is how that relationship applies when the 
frequency change comes from something other than simple multiplication 
or division.

For example (and my real-life concern), if I have an analog-to-digital 
converter that is clocked at 122.88 MHz and I know the phase noise of 
that clock signal, what do I know about the effective phase noise when 
the ADC is receiving a signal at, e.g., 12.288 MHz?

In other words, if I were to measure the phase noise at the output of 
the ADC when fed a sufficiently clean 12.288 MHz signal, would I see 
something like the 122.88 MHz clock's phase noise, or something better 
due to the scaling by 10?
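
To put numbers on the two possibilities I'm weighing, here's a small 
sketch. The -150 dBc/Hz clock figure is made up purely for illustration, 
not a measurement:

import math

# Hypothetical phase noise of the 122.88 MHz clock at some offset,
# in dBc/Hz (made-up number, for illustration only).
clock_noise_dbc = -150.0

f_clock = 122.88e6
f_input = 12.288e6

# Possibility 1: the sampled 12.288 MHz output shows roughly the
# same phase noise as the clock itself.
same_as_clock = clock_noise_dbc

# Possibility 2: the noise scales with the frequency ratio, i.e.
# improves by 20*log10(f_clock / f_input) = 20 dB for a 10:1 ratio.
scaled = clock_noise_dbc - 20 * math.log10(f_clock / f_input)

print(f"Same as clock:       {same_as_clock:.1f} dBc/Hz")
print(f"Scaled by the ratio: {scaled:.1f} dBc/Hz")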

Thanks!

John





