[time-nuts] TV Signals as a frequency reference

Bill Byrom time at radio.sent.com
Sun Apr 1 01:44:00 EDT 2018


In the mid-1970's (when I was an EE student in college) I built a simple
setup to compare the US color burst signal (3.5795454.. MHz) from an old
vacuum tube color television set with a commercial surplus 5 MHz OCXO
(probably from Bliley). The color burst frequency was exactly
315/88 = (63 * 5)/88 = 3.5795454.. MHz.

So I built two frequency dividers. I think I used 7490 or 7492 TTL
IC's and a few gates. I divided the color burst signal from the TV set
by 63 to get a 56.81818.. kHz signal and compared the phase to the 5 MHz
oven oscillator divided by 88 (also 56.81818.. kHz). I had a surplus
Tektronix RM45A oscilloscope with a CA plug-in (24 MHz bandwidth) and a
surplus well-used analog chart recorder.
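The divider arithmetic works out exactly because 315 = 63 * 5, so dividing the burst by 63 and the 5 MHz OCXO by 88 lands both on the same comparison frequency. A quick sanity check with Python's exact fractions (just an illustration, not part of the original setup):

```python
from fractions import Fraction

# NTSC color subcarrier: exactly 315/88 MHz (3.5795454.. MHz)
f_burst = Fraction(315, 88)   # MHz
f_ocxo = Fraction(5, 1)       # MHz, the surplus oven oscillator

# Divide the color burst by 63 and the 5 MHz OCXO by 88;
# both land on 5/88 MHz = 56.81818.. kHz, so their phases can be compared
div_burst = f_burst / 63
div_ocxo = f_ocxo / 88

print(div_burst == div_ocxo)     # True
print(float(div_burst) * 1000)   # 56.8181.. (kHz)
```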
With this setup I could see the phase of the color burst signal of the
three major US networks (CBS, NBC, and ABC) when their local affiliate
station (Austin, Texas) was broadcasting a network feed. This was before
frame resynchronizers, so there was a glitch in the phase (and often a
frequency change which was easily detectable) when they switched away
from the network feed.
By late 1976 I was out of college at my first job (Rohdes-Groos
Laboratories), where I was the only engineer. My boss wanted us to
manufacture a low-cost frequency reference for calibration labs using
the color burst signal to discipline a local crystal oscillator. At that
time (mid to late 1970's) the three networks used rubidium or cesium
atomic frequency references to control their network feed color burst
and horizontal sync signals. NBS (later NIST starting in 1988) measured
the frequency error (and maybe the phase error - I forget) of each of
the networks on a daily basis, and published these in a document
released shortly after the end of each month (as I remember it). So you
could make local measurements and a few weeks later you could correct
for the measured error of that feed compared to the NIST reference. The
general idea was to average or curve fit the NBS errors for some
interval (a week or a month) and compare that against local measurements
you made during that interval. If the frequency error was stable or
linearly drifting you could make reasonable predictions for future
measurements, so you didn't have to wait for the NBS error reports to
get good results. We had a WWVB receiver and rubidium standard in our
office, so we could check the performance of our project.
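The averaging/curve-fit idea amounts to a least-squares line fit over the published daily errors, then extrapolating. A minimal sketch in Python; all numbers below are invented for illustration (the real NBS bulletins carried the actual measured network offsets):

```python
# Hypothetical daily fractional-frequency errors of a network feed,
# in parts in 10^11, standing in for one week of NBS report data.
days = [1, 2, 3, 4, 5, 6, 7]
err = [3.0, 3.1, 3.1, 3.2, 3.3, 3.3, 3.4]

# Ordinary least-squares line fit: err ~ offset + slope * day
n = len(days)
mean_x = sum(days) / n
mean_y = sum(err) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(days, err))
         / sum((x - mean_x) ** 2 for x in days))
offset = mean_y - slope * mean_x

# If the drift is linear, predict the error for day 10,
# before the next monthly report arrives
predicted = offset + slope * 10
print(f"drift: {slope:.3f} parts in 10^11 per day, "
      f"day-10 estimate: {predicted:.2f}")
```

This is only useful while the feed drifts smoothly, which is exactly why the network "tweaking" described below broke the scheme.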
A local company in Central Texas (unrelated to my company) developed a
simple product which detected the phase glitch in the color burst or
horizontal sync signal when the network feed phase changed. This was
used with other information to switch local programming (commercials,
etc.) into or out of the transmitted signal.
The first problem with using the network feeds to distribute frequency
was that at least one network started to look at the NBS frequency/phase
deviation reports and, thinking they were going to improve this process,
tweaked the magnetic field fine adjustment on their atomic standard in
an attempt to discipline the frequency. Of course, this removed the
ability to predict the behavior of that feed, since it might walk up or
down in frequency at any time the network tweaked their standard. So
everyone wanted the TV networks to stop tweaking their standards and
just let them slowly drift in a temperature stable environment.
But the worst problem was the introduction of frame resynchronizers.
This meant you were now measuring the local station frequency reference,
which wasn't usually all that good. So we cancelled the color burst
frequency standard project.
--
Bill Byrom N5BB



On Sat, Mar 31, 2018, at 4:00 PM, Don Murray via time-nuts wrote:
> Dana...
>  
> Back in the day when out of studio news stories were
> shot on film, which was then processed at the studio 
> and broadcast from a "film chain" stations would lock
> their sync generators to the incoming network signal
> during network hours.  That allowed "clean" switching
> in and out of network programming.
>  
> When you were in the local programming portion of the
> day, the local sync generator would not be "looking" at
> the network signal for reference.  That was done because
> there may have been times when the AT&T microwave
> network was down for maintenance.  Obviously this was
> before the days of satellite delivery of the network services.
>  
> You are correct...  when the "Live Truck" came on the
> scene with instant on scene video, etc, the demand
> for frame syncs at each station went up.
>  
> Our first frame sync, at WTVJ in Miami, had been used
> at the Cape for some of the moon shots.  It was a huge
> box, occupying about two feet of rack space!
>  
> Later frame syncs would drop in size to 1RU!
>  
> All those frame syncs were locked to our local
> master sync generator.  At one of our monitoring
> positions I could compare our local 3.58MHz
> color burst frequency to the networks and adjust
> the phase so they were in agreement.  This was
> just a good method of checking our "in house"
> reference to have it on frequency.  If the 3.58
> was on frequency, all the other outputs from
> the master sync generator would be correct.
>  
> Later sync generators were GPS disciplined.
>  
> BTW... our later model analog transmitter was GPS locked
> with one of the original HP boxes.  I remember ordering the
> HP and then WAITING forever for it to arrive.  ;-)
>  
> In the interim, the transmitter ran on its TCXO box.
>  
> We had twice yearly frequency measurements done
> by a monitoring service up the coast.
>  
> 73
> Don
> W4WJ
>  
>  
> In a message dated 3/31/2018 11:04:12 AM Central Standard Time,
> k8yumdoober at gmail.com writes:
>  
> I've always been curious as to why TV stations did not lock at least
> their in-house equipment to the network feed as a means to avoid
> spending money on frame syncs. Remote coverage, on the other
> hand, would of course open a new can of worms.
> 
> But compared to the cost of building and powering a TV station and
> associated studios etc, a Rb or three cost a mere drop in the bucket
> to buy and maintain, so I'm baffled as to why stations in general did
> not use them on a regular basis.
> 
> Dana
> 
> 
> On Sat, Mar 31, 2018 at 12:43 AM, Hal Murray <hmurray at megapathdsl.net>
> wrote:
> 
>>> As noted earlier, color burst references were a big deal a long
>>> time ago.
>>
>> Thanks. I was fishing for something modern, maybe a bit clock
>> out of the digital receiver.
>> 
>> I'm assuming that the digital stream is locked to the carrier.
>> That may not be correct.
>> 
>> 
>> --
>> These are my opinions. I hate spam.
>> 
>> 
>> 
>> _________________________________________________
>> time-nuts mailing list -- time-nuts at febo.com
>> To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
>> and follow the instructions there.
>> 



