[time-nuts] Time/freq from digital TV

David I. Emery die at dieconsulting.com
Sat Jun 13 21:54:05 EDT 2009

On Sat, Jun 13, 2009 at 03:18:16PM -0700, Hal Murray wrote:
> My radio has many news stories about the end of analog TV.
> What sort of time or frequency can I get from a digital TV signal?
> Now that frame buffers are common, does each station use its own master clock?

	I don't believe there is any standards-grade time or frequency
that one can derive from an FCC-legal, standards-conformant ATSC
transmission in digital format.

	The specs for ATSC symbol rate and carrier frequency are fairly
loose (by time-nuts standards) - neither is dead nuts on in any
broadcast plant unless someone engineering it really cared for some
reason.

	And nothing in the ATSC stream establishes a precise epoch - the
stream consists of 188-byte transport stream packets... sent from a
queue when a time slot is available... with no particular discipline
that a particular packet be sent at a particular exact time.

	There is, however, a timing mechanism in MPEG-2 TV based on a
27 MHz clock that controls and coordinates video and audio rendering in
digital TVs.  Part of this is that some packets carry a timestamp in
ticks of this clock, called the PCR (Program Clock Reference), so a
receiver can lock its own 27 MHz video timebase to it and output video
frames and audio samples at the correct times - correct mostly relative
to each other and to the video source, of course, not to any absolute
UTC time.
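	For the curious, the PCR lives in the optional adaptation field
of a transport stream packet: a 33-bit base counting in 90 kHz units
plus a 9-bit extension counting in 27 MHz units.  A minimal sketch of
pulling it out (my own illustration, assuming a well-formed 188-byte
packet) might look like:

```python
def extract_pcr(packet: bytes):
    """Return the PCR in 27 MHz ticks from a 188-byte TS packet, or None."""
    assert len(packet) == 188 and packet[0] == 0x47   # 0x47 sync byte
    afc = (packet[3] >> 4) & 0x3                      # adaptation_field_control
    if afc not in (2, 3):                             # no adaptation field present
        return None
    af_len = packet[4]
    if af_len == 0 or not (packet[5] & 0x10):         # PCR_flag not set
        return None
    b = packet[6:12]                                  # 48-bit PCR field
    # 33-bit base (90 kHz ticks), 6 reserved bits, 9-bit extension
    pcr_base = (b[0] << 25) | (b[1] << 17) | (b[2] << 9) | (b[3] << 1) | (b[4] >> 7)
    pcr_ext = ((b[4] & 0x01) << 8) | b[5]
    return pcr_base * 300 + pcr_ext                   # ticks of the 27 MHz clock
```

Note the 33-bit base wraps after 2^33 / 90000 seconds, roughly 26.5
hours - so even the PCR itself carries no absolute epoch.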

	A broadcast plant may or may not lock its master 27 MHz PCR
reference to a GPSDO or rubidium - there is no requirement to do so,
and no particular expectation that any epoch of the PCR clock matches
a particular UTC epoch.

	Likely the choice is between using the OCXO in a house master
sync generator or ATSC multiplexer as the reference and using an
external clock input from some kind of frequency standard.  I suppose
there ARE stations that use a decent GPSDO or other high-grade
frequency source for this... how one would know this is true in any
particular case, however, is not clear.

	Most network broadcast distribution is via satellite, and
satellites move around in their box in the sky, so any time or frequency
derived from satellite-downlinked signals is subject to Doppler shifts
over the course of a day as the satellite traces its figure-eight
pattern in the sky (most operational birds are slightly inclined, hence
the figure eight - none are dead nuts on the equator in perfectly
circular orbits).  This means that no network timing from a satellite
signal is stable by precision metrology standards... even if the uplink
signal is right on.
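	To put a rough number on that: for a bird held to, say, 0.05
degrees of inclination, the cross-track excursion is about R*sin(i),
and if one assumes (my assumption, as a worst case) that the slant
range oscillates sinusoidally once per sidereal day with no more than
that amplitude, the peak fractional Doppler shift works out to a few
parts in 1e9.  A back-of-envelope sketch:

```python
import math

C = 299_792_458.0      # speed of light, m/s
R_GEO = 42_164_000.0   # geostationary orbital radius, m
T_SID = 86_164.0       # sidereal day, s

def doppler_bound(inclination_deg: float) -> float:
    """Rough upper bound on the fractional Doppler shift (delta_f / f)
    of a GEO downlink, assuming the slant range oscillates sinusoidally
    once per sidereal day with amplitude no larger than the cross-track
    excursion R * sin(i)."""
    amp = R_GEO * math.sin(math.radians(inclination_deg))  # m
    v_max = 2 * math.pi * amp / T_SID   # peak radial velocity, m/s
    return v_max / C

# doppler_bound(0.05) comes out around 9e-9 - a few parts in 1e9,
# enormous by time-nuts frequency-standard expectations.
```

The actual shift seen at a given earth station depends on its geometry
relative to the orbit, but the order of magnitude is the point here.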

	What all this means in practice is that there is no longer any
broadcast TV signal that can be depended on as really precise.  Of
course this has already been true for at least the past 20 years, even
for analog NTSC transmissions, given the ubiquitous digital broadcast
plant... for the most part the timing of the old analog NTSC
transmissions was derived from the OCXO (or even just a TCXO) in either
a master sync generator or the digital NTSC modulator just ahead of the
transmitter's analog modulator input.  Long, long gone are the 1970s-era
days of a completely analog, un-frame-buffered path between a stable
terrestrial-microwave link to a master rubidium or cesium network clock
in NYC and the input to the transmitter...

  Dave Emery N1PRE/AE, die at dieconsulting.com  DIE Consulting, Weston, Mass 02493
"An empty zombie mind with a forlorn barely readable weatherbeaten
'For Rent' sign still vainly flapping outside on the weed encrusted pole - in 
celebration of what could have been, but wasn't and is not to be now either."
