[time-nuts] 50 vs 75 ohm cables
johnday at wordsnimages.com
Thu May 10 23:10:51 EDT 2007
At 01:50 PM 5/10/2007, WB6BNQ wrote:
> John Day wrote:
> At 11:00 AM 5/10/2007, WB6BNQ wrote:
> >Hi Peter,
> >In an effort to standardize, the industry selected the mid point between
> >the 35 Ohms and the 72 Ohms, that being 50 Ohms. This forced the
> >manufacturers to design their antennas for 50 Ohms or provide a
> Nice thought, but in fact the comments made earlier are more
> Early co-axial connectors go back to Belling-Lee in the UK in the
> early 20's with what has now become the IEC 169-2 or 'PAL' connector.
> Although not a controlled-impedance connector, it made the use of
> coaxial cable convenient. But like most coaxial components it had to
> wait for radar to become really useful.
> During the war it was used for video and IF connections in radar.
> The reason that coax was not widely used was due to cost. It was not
> readily available in large quantities, so the only people able to get
> and use it were the Government and government-funded research.
Oh? Where did that information come from?
> 52 ohms was in fact the compromise. In 1929 experimental work at Bell
> Laboratories found that the ideal impedances for coaxial cable
> were 30 ohms for high power, 60 ohms for high voltage and 77 ohms
> for low attenuation. Thirty ohm cable is very difficult to make, not
> very flexible, and is expensive. So the Bell folk decided that 52 ohms
> was the best compromise between 30 and 60 ohms. This has become the
> 50 ohm cable we know today.
> I do not directly know of the referenced Ma Bell experiments.
> However, if such claims were made, they were certainly taken out of
> context! Without the context or application that was being
> researched, these numbers have little meaning.
I am sure a reference to the appropriate material will be forthcoming,
as I recall that AT&T filed for some more patents in 1929. Again I am
trusting my memory, but the scientists concerned were Lloyd
Espenschied and Herman Affel. We can assume that they were aware of
the work of Siemens in 1884 and Tesla in 1894, and also that Oliver Lodge
demonstrated the waveguide effect of a coaxial structure in either
1892 or 1894. In 1907 Vail (one of Bell's partners), who was to become
president of AT&T, combined the Western Electric and AT&T research
facilities. Before his death in 1920 he had been pushing for a
method of increasing the bandwidth of the cable systems then in use.
Twisted pairs suffer a plethora of disadvantages.
Espenschied and Affel were charged with understanding the nature of
cables of all types; this priority was laid down by Vail himself. 1929
became a very significant year - it was in January that the decibel
officially became the unit for measuring line loss in the Bell System,
and in May that Espenschied and Affel reported on coaxial cable. In
1931 Espenschied and Affel were granted US patent 1,835,031 for
broadband coaxial transmission systems. If you search on him you will
find a good little bio on the IEEE website. Lloyd Espenschied only
died in 1986 and I had the fortune to meet him at a symposium in 1974,
when he would have been in his mid 80's. He was the inventor of the
loading coil for telephone systems and also held a patent on quartz
crystal resonators used in filters.
At Bell Labs, or anywhere else at the time, nobody really understood
what coaxial cables could do. My understanding is that they started
with a single solid centre conductor and made a whole range of cables
covering a variety of impedances. Both Espenschied and Affel had
been involved in the first carrier multiplex system produced by Bell,
which was used between Baltimore and Pittsburgh in 1916. It was the
problems associated with this that finally drove the work on coax.
Affel became involved as a result of his collaboration with Arthur
Kennelly (of Kennelly-Heaviside layer fame) in 1916 at MIT, where
they published a paper on the skin effect in conductors at high
frequencies. Kennelly had worked with Edison until 1893, the same year
in which he published a most important work on impedance in which he
used complex quantities.
This skin effect work was all the more remarkable because the
research was carried out at 100kHz.
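For a sense of scale, the skin depth they were probing can be sketched with the standard formula delta = sqrt(rho / (pi * f * mu)); this is my own illustration, and the copper resistivity used is an assumed modern handbook value, not a figure from their paper.

```python
import math

MU_0 = 4 * math.pi * 1e-7   # permeability of free space, H/m
RHO_CU = 1.68e-8            # resistivity of copper, ohm-metres (assumed value)

def skin_depth(freq_hz, rho=RHO_CU, mu=MU_0):
    """Depth (m) at which current density falls to 1/e of its surface value."""
    return math.sqrt(rho / (math.pi * freq_hz * mu))

# At 100 kHz the current in a copper conductor is confined to roughly
# the outer 0.2 mm - already far thinner than typical wire diameters.
print(round(skin_depth(100e3) * 1000, 3), "mm")  # -> 0.206 mm
```

Even at 100 kHz the effect is measurable, which is what made the 1916 work remarkable for its day.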
As mentioned earlier, the Espenschied and Affel cable consisted of a
solid centre conductor placed inside a tube and supported by
insulating washers. Hard on the heels of this work AT&T laid an
experimental coaxial cable between New York and Philadelphia in 1936.
I don't know if the ARRL's QST CD-ROMs have all the advertisements in
them, but you might want to look at the December 1945 issue where
Amphenol published the first advertisements that I have been able to
find for coaxial cable.
> Why make coax for 30 Ohms when no systems were in existence that
> utilized such an impedance?
> How does 60 Ohms differ from 77 Ohms when discussing high voltage?
> How much high voltage are we talking about?
> How does 77 Ohms provide lower attenuation than 60 Ohms?
> The fact of the matter is that some concept was being studied by MA
> BELL utilizing what was currently available at the time of that
> study. Those values could have been determined based upon the quality
> of available materials of that time frame.
> If the high voltage was "DC" then the cable impedance has little
> importance, except during the rise time from off to on
> (notwithstanding dielectric leakage). If it was "AC" then:
> What was the applied frequency?
> With no facts as to the targeted application or the design criteria,
> the reasons for the above quoted values have no merit. What does make
> sense is that you provide a medium that satisfies the maximum transfer of
> power between two points no matter what the matching impedances are.
> This would mean the cable impedance would match whatever the system
> (source and terminating) impedance is.
The figures have merit because they exist; you might like to track
down a copy of:
L. Espenschied and M. E. Strieby, "Systems for Wide-Band Transmission
over Coaxial Lines," Bell System Technical Journal 13 (October 1934).
Sadly I no longer have a collection of the Bell System Journals, but
there are plenty of libraries that do.
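Those three numbers also fall straight out of the geometry of an air-dielectric line, so they can be checked without the original paper. The sketch below is my own illustration (not Bell's method): it grid-searches the classic proportionalities for conductor loss, peak voltage and peak power at a fixed outer diameter D and breakdown field.

```python
import math

def z0_air(ratio):
    """Characteristic impedance of an air-dielectric coax with D/d = ratio."""
    return (376.730 / (2 * math.pi)) * math.log(ratio)

# With the outer diameter D held fixed, the classic proportionalities are:
def atten_merit(ratio):    # conductor loss ~ (1/d + 1/D)/ln(D/d); negate to maximise
    return -(ratio + 1) / math.log(ratio)

def voltage_merit(ratio):  # peak voltage at fixed breakdown field ~ ln(D/d)/(D/d)
    return math.log(ratio) / ratio

def power_merit(ratio):    # peak power into a matched load ~ ln(D/d)/(D/d)^2
    return math.log(ratio) / ratio ** 2

def best_ratio(merit, lo=1.05, hi=10.0, n=200_000):
    """Crude grid search for the D/d ratio maximising a merit function."""
    return max((lo + i * (hi - lo) / n for i in range(n + 1)), key=merit)

for name, merit in [("attenuation", atten_merit),
                    ("voltage", voltage_merit),
                    ("power", power_merit)]:
    print(name, round(z0_air(best_ratio(merit)), 1), "ohms")
```

The optima land at roughly 76.7, 60.0 and 30.0 ohms respectively, matching the 77/60/30 figures quoted above.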
> Seventy-five ohms is also a compromise; in fact a folded dipole has,
> if I recall correctly (I can't check any books as I am away
> from the office), a feed point impedance of 73 ohms.
> SORRY, a folded dipole is referred to as a 300 Ohm feed point.
> So 75 ohms as we
> know it now is a compromise between the low attenuation 77 ohms and
> the 73 ohm dipole feed-point.
> We also tend to think in terms of ONLY 50 and 75 ohm. But in fact
> (again IIRC) RG-8 cable is actually 52 ohms, RG-59A is 73 ohms
> (wonder why!) but RG-58B is 75 ohms. RG-11, which found use for
> cable during the war, is 75 ohms. But amongst the older cables there
> are cables at 52.5, 51, 76 as well as 50, 52 and 75 ohms. Newer cables
> tend to be either 50 or 75 ohm, with 93, 120, 125 and even 950 ohm
> available for special uses.
> >The television world originally used 300 Ohms at the antenna, along with
> >300 Ohm "Twin Lead" and the 300 Ohm input to the television itself. The
> >reason for the 300 Ohms was due to the use of a folded dipole which has a
> >characteristic impedance of 300 Ohms.
> Well, again IIRC, it is not actually 300 ohms; if memory serves it is
> in fact more like 273 ohms - plus or minus the effect of conductor
> diameter.
> OK, if we are going to split hairs then the folded dipole is a 4:1
> transformation and the nominal free space impedance is 72 Ohms. So a
> 4:1 ratio would make the folded dipole, in actuality, 288 Ohms. 288
> is an awkward number to roll off your tongue; that is why it is called
> 300 Ohms.
The feed-point impedance of the folded dipole can be moved around
almost at will. See this note:
http://www.ece.msstate.edu/~donohoe/ece4990notes9.pdf - go to the
bottom of the last page for the final derivation. By changing the
spacing of the internal transmission line sections in the antenna and
their characteristic impedance you can do all sorts of things with a
folded dipole. I think Kraus will give the same derivations in a lot more
detail (Kraus, J., Antennas, 2nd edition). Sadly I don't have either
Antennas or his Electromagnetics text with me, but I am sure any
decent antenna textbook will give you the detailed answers.
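As a rough illustration of that adjustability, here is a sketch of the unequal-conductor step-up formula as it appears in standard antenna texts; treat the exact form, and the variable naming, as my assumption to verify against Kraus or the linked note, not as a quotation from either.

```python
import math

def stepup_ratio(r_fed, r_unfed, spacing):
    """Impedance step-up of a two-conductor folded dipole over a plain
    dipole: (1 + a)^2, where a is the current-division factor for a
    driven conductor of radius r_fed, an undriven conductor of radius
    r_unfed, and centre-to-centre spacing (all in the same units)."""
    u = r_unfed / r_fed
    v = spacing / r_fed
    a = (math.acosh((v**2 - u**2 + 1) / (2 * v)) /
         math.acosh((v**2 + u**2 - 1) / (2 * u * v)))
    return (1 + a) ** 2

# Equal conductors give the textbook 4:1, i.e. about 4 x 72 = 288 ohms.
print(round(stepup_ratio(1e-3, 1e-3, 12.5e-3), 2))  # -> 4.0
# A fatter undriven conductor pushes the ratio above 4:1;
# a fatter driven conductor pulls it below.
print(round(stepup_ratio(1e-3, 3e-3, 12.5e-3), 2))
```

This is why practical folded dipoles can be designed for feed impedances well away from 288 ohms just by choosing conductor sizes and spacing.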
> >The folded dipole design could be
> >tweaked to provide the wide frequency range needed to cover all the
> >frequencies, especially with the UHF channels.
> >The switch to coax for TV use came about in an effort to prevent or
> >greatly reduce ghosting problems, and for cable systems as a
> >means of transporting the signals to many locations. "Twin Lead" cannot
> >tolerate being near metal objects and is unable to be buried. Coax
> >contains the signal entirely within its own shielded structure and
> >therefore can be buried and laid next to other metal objects without
> >degrading the signal quality.
> >The reason 75 Ohms was selected for the TV world was because an
> >easily constructed 4:1 balun (transformer) would transform 300 Ohms to
> >75 Ohms.
> That factor accelerated its acceptance. But in fact 75 ohms was
> established in the TV industry before coax found its way to the
> receiving antenna - as the cable for carrying video signals, due to
> the lower losses it exhibited when run all the way around
> buildings carrying video.
> SORRY, again this is simply not so!
Twin lead descended from the open-wire transmission lines used by
amateurs and broadcasters since the early years of the 20th century.
When television became a reality in about 1935, coaxial cable wasn't
available outside of the Bell System, at least not in the US. The
original transmitters at Alexandra Palace in London in the 1930's
were placed adjacent to the studios; they didn't have anything of low
loss to move the video from video mixing (as it was known in the UK;
switching in the US). After the war they had coax like everybody
else, and despite the cost at the time they used it in huge quantities.
> Base band video is not "RF" at 60 MHz and up. Base band video barely
> made it to 5 MHz in the Black & White days.
Once upon a time 30 MHz was UHF!
> Again, "Twin Lead" was dirt cheap compared to the manufacturing cost
> of coax back in the 1950's and 1960's. "Twin Lead," if made with
> quality material, has much lower loss than coax, especially at very
> high frequencies. The TV transmitters were of lower power and their
> antennas were not all that high in gain in those early years. So,
> with relatively insensitive TV's of the early years and the poor gain
> of the receiving TV antennas, you really needed all the help you could
> get.
> OK, I am ready for round two.
OK, laughter inserted here.
No need for a battle. Each one of us understands history slightly
differently - usually coloured by our own knowledge and field of
expertise. Now I can't tell you when coax came to be used on TV
receivers in the US or the UK. But I have a sneaking suspicion it
would have occurred when KPTV in Portland, Oregon, fired up late in
1952; in the UK it would have been when BBC2 started on UHF in 1964.
It makes a certain sense also that in the US the F connector might
have been the most prominent, as it was invented in the early 1950's
by Eric Winston while working on "master antenna television" (MATV)
systems for Milton Shapp at Jerrold Labs. Although CATV had gotten its
start in 1948, I don't know what connectors were used. Shapp was
originally interested in apartment houses and department stores, but
Tarlton and others saw the possibility of a coaxial cable based
system carrying many channels around a town. So maybe THIS is where
coax meets the TV receiver?
Kindly, John (ex VK3ZJF)
>time-nuts mailing list
>time-nuts at febo.com