[time-nuts] 50 vs 75 ohm cables

WB6BNQ wb6bnq at cox.net
Thu May 10 15:00:04 UTC 2007


Hi Peter,

If you make a dipole (not folded) antenna for any frequency, it will have
a feed point impedance of 72 Ohms in free space.  It will keep essentially
the same impedance until you get quite close to the ground.  A properly
made electrical 1/4-wavelength vertical antenna will have a feed point
impedance of 35 Ohms.

In the old tube equipment, the output stages were designed with variable
components such that the adjustment range could compensate for a wide
range of impedances, both above and below the values given above.  This
was easy to do because the tube plate impedances were on the order of
2000 to 5000 Ohms, and the transformation was easily handled by variable
components.
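
As a rough illustration of how that transformation works (a sketch of my
own, not from any particular rig), here are the standard lossless
L-network formulas in Python; the 3000 Ohm plate load and 7 MHz operating
frequency are just assumed example values:

    import math

    def l_network(r_high, r_low, freq_hz):
        """Series-L / shunt-C values for a lossless L-network matching
        r_high Ohms down to r_low Ohms at freq_hz."""
        q = math.sqrt(r_high / r_low - 1)    # loaded Q is set by the ratio
        x_shunt = r_high / q                 # shunt reactance on the high side
        x_series = q * r_low                 # series reactance on the low side
        w = 2 * math.pi * freq_hz
        return q, 1 / (w * x_shunt), x_series / w  # Q, C in farads, L in henries

    # Hypothetical example: 3000 Ohm plate impedance down to 50 Ohms at 7 MHz
    q, c, l = l_network(3000, 50, 7e6)
    print(f"Q = {q:.1f}, C = {c * 1e12:.0f} pF, L = {l * 1e6:.2f} uH")

With variable capacitors and inductors covering values of that size, one
network could be tuned across a wide spread of plate and load impedances,
which is exactly the flexibility the tube designs enjoyed.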

In the solid state world things are a bit more complicated.  First, solid
state devices typically have very low impedances.  That low impedance
made it possible to design very wide band circuits that need no
adjustment when changing frequencies, unlike their tube counterparts.
Unfortunately, a wide band design requires selecting a fixed output
impedance that cannot easily, if at all, be made adjustable.

In an effort to standardize, the industry selected the midpoint between
the 35 Ohms and the 72 Ohms, that being 50 Ohms.  This forced the antenna
manufacturers to design their antennas for 50 Ohms or provide a matching
network.
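
For what it is worth, 50 Ohms is almost exactly the geometric mean of the
two values (a quick check of my own, not part of the original reasoning):

    import math
    print(math.sqrt(35 * 72))   # ~50.2 Ohms: nearly the geometric mean of 35 and 72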

The television world originally used 300 Ohms at the antenna, along with
300 Ohm "Twin Lead" and a 300 Ohm input on the television itself.  The
reason for the 300 Ohms was the use of a folded dipole, which has a
characteristic impedance of 300 Ohms.  The folded dipole design could be
tweaked to provide the wide frequency range needed to cover all the TV
channels, especially the UHF ones.

The switch to coax for TV use came about in an effort to prevent, or
greatly reduce, ghosting problems, and for cable systems as a reliable
means of transporting the signals to many locations.  "Twin Lead" cannot
tolerate being near metal objects and cannot be buried.  "Coax" contains
the signal entirely within its own shielded structure and can therefore
be buried and laid next to other metal objects without degrading the
signal quality.

The reason 75 Ohms was selected for the TV world was that a simple,
easily constructed 4:1 balun (transformer) would transform 300 Ohms to
75 Ohms.  Trying to go from 300 Ohms to 50 Ohms would require a 6:1
impedance ratio, with increased I²R losses and greater difficulty in
obtaining wide band operation in the early days of ferrite mixes.
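
The winding arithmetic behind that is simple: an ideal transformer
transforms impedance by the square of the turns ratio, so a 4:1 impedance
ratio needs a clean 2:1 winding while 6:1 needs an awkward sqrt(6):1.  A
quick sketch of my own:

    import math

    def turns_ratio(z_in, z_out):
        """Ideal transformer: impedance transforms as the square of the turns ratio."""
        return math.sqrt(z_in / z_out)

    print(turns_ratio(300, 75))   # 2.0    -> a clean 2:1 winding (4:1 impedance)
    print(turns_ratio(300, 50))   # ~2.449 -> an awkward non-integer winding (6:1)
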
I am sure there were other considerations in the process, but at the
moment I cannot think of any.  At any rate, the above is the primary
reason why things went the way they did.

I see that a range of comments have already been made, but I think they
are off base.  The natural constants of the universe dictate the behavior
of antenna systems, and that is what dictated the course of history.

I hope this helps explain the history a bit.

Bill....WB6BNQ

Peter Vince wrote:

> I came across some telecom equipment the other day which had
> reference outputs marked as 75 ohms.  I work in television, not
> telecoms, and we use 75 ohm connections for video, but with most RF
> stuff being (I believe) 50 ohms, and certainly all the HP and other
> counters seem to have 50 ohm inputs, I rather assumed telecoms used
> 50 ohms - obviously not!  Can anyone tell me how and why the 50/75
> ohm distinction came about?  Was it perhaps a VHS/Betamax type issue
> with different manufacturers going their own ways, and then with so
> much equipment out in the field,  neither side was willing to change?
>
>         Thanks,
>
>                 Peter (London)




