[time-nuts] 50 vs 75 ohm cables

wa1zms at att.net
Thu May 10 13:43:05 UTC 2007


Peter-

Maybe others have hard historical evidence, but my understanding is that 50 ohm cable was selected for its better power-handling capability in high-power RF applications, while the 75 ohm standard was used where low loss mattered most. That's why CATV folks use 75 ohm cables. The difference might be only a fraction of a dB lower loss than 50 ohm cable (say, over a 100 m length), but across many miles of cable it all adds up to a real advantage.

Cables below 50 ohms have an even greater power-handling advantage, but their weight starts to increase, so 50 ohms was settled on as a compromise.
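For anyone who wants to check the numbers, the classic textbook trade-off can be reproduced in a few lines of Python. The figures of merit below are the standard ones for an air-dielectric coax of fixed outer diameter (my addition, not something from the original post): with x the ratio of outer to inner conductor radius, Z0 is about 60*ln(x) ohms, conductor loss varies as (1+x)/ln(x), and breakdown-limited peak power as ln(x)/x**2.

import math

# Figures of merit for air-dielectric coax with a fixed outer-conductor
# diameter, as a function of the radius ratio x = b/a (textbook results):
#   Z0 ~ 60 * ln(x) ohms             (air dielectric)
#   conductor loss ~ (1 + x)/ln(x)   (smaller is better)
#   peak power     ~ ln(x)/x**2      (larger is better, breakdown-limited)

def z0(x):
    return 60.0 * math.log(x)

def loss_factor(x):
    return (1.0 + x) / math.log(x)

def power_factor(x):
    return math.log(x) / x**2

# Scan the ratio numerically rather than solving the transcendental equations.
xs = [1.001 + i * 0.0001 for i in range(90000)]
x_loss = min(xs, key=loss_factor)   # minimum-loss geometry
x_pwr = max(xs, key=power_factor)   # maximum-peak-power geometry

print(f"minimum loss  at Z0 = {z0(x_loss):.1f} ohm")   # ~76.7 ohm
print(f"maximum power at Z0 = {z0(x_pwr):.1f} ohm")    # ~30.0 ohm

# How a 50 ohm line compares with each optimum:
x50 = math.exp(50.0 / 60.0)
print(f"50 ohm loss penalty: {loss_factor(x50) / loss_factor(x_loss):.2f}x")  # ~1.10x
print(f"50 ohm peak power  : {power_factor(x50) / power_factor(x_pwr):.2f}x") # ~0.86x

The scan lands on the familiar numbers: loss bottoms out near 77 ohms, peak power peaks near 30 ohms, and 50 ohms gives up only about 10% in loss and about 14% in peak power against the respective optima. It also shows why the accumulation matters: loss scales linearly with length, so a 0.1 dB difference per 100 m becomes 10 dB over a 10 km trunk run.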

-Brian, WA1ZMS


-------------- Original message ----------------------
From: Peter Vince <pvince at theiet.org>
>
> I came across some telecom equipment the other day which had 
> reference outputs marked as 75 ohms.  I work in television, not 
> telecoms, and we use 75 ohm connections for video, but with most RF 
> stuff being (I believe) 50 ohms, and certainly all the HP and other 
> counters seem to have 50 ohm inputs, I rather assumed telecoms used 
> 50 ohms - obviously not!  Can anyone tell me how and why the 50/75 
> ohm distinction came about?  Was it perhaps a VHS/Betamax type issue 
> with different manufacturers going their own ways, and then with so 
> much equipment out in the field, neither side was willing to change?
> 
> 	Thanks,
> 
> 		Peter (London)
> 
> _______________________________________________
> time-nuts mailing list
> time-nuts at febo.com
> https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
