[time-nuts] Re: Another reason to monitor line frequency :) - My AC measurement project & question

ed breya eb at telight.com
Sat Jan 22 23:30:54 UTC 2022


I'd vote for going with a transformer too, but not just any old 
transformer - I'll explain later.

You can indeed connect directly to line with an AC divider, and measure 
the signal. In fact, you can even build a very broadband probing system 
that goes all the way down to DC and up to RF, by making something 
equivalent to a high (1 meg) impedance oscilloscope front end. There are 
complications though. Measurement-wise, what's actually needed is a 
differential system looking between neutral and line, so two identical 
attenuators are required to pick off and preserve the signals for 
subtraction and processing. It is difficult to get the balance and 
symmetry needed for good CMRR at higher frequencies. Line frequency and 
plenty of its harmonics can be handled OK, giving good waveform fidelity 
well into the audio range, but higher-frequency differential and 
common-mode junk will blow right through. You would need all sorts of 
filtering and clamping to control the signal quality and protect the 
instrumentation. The input resistance should be kept high, like 1 meg, 
to be safer against faults, and to prevent tripping GFCIs (RCDs). For 
instance, if you're looking at 120 VAC on a 1 meg front end, the hot-side 
input current will be around 120 uA RMS, adding only a small amount to 
the earth ground loops, compared to a typical 5 mA GFCI trip point.
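
As a quick sanity check on those numbers, here's a minimal Python 
sketch (the 1 meg input resistance and 5 mA trip point are the figures 
from the paragraph above, not measurements):

    # Input current of a 1 meg front end across 120 VAC, compared to a
    # typical 5 mA GFCI (RCD) trip threshold.
    V_LINE = 120.0      # line voltage, V RMS
    R_INPUT = 1e6       # front-end input resistance, ohms
    GFCI_TRIP = 5e-3    # typical GFCI trip current, A

    i_in = V_LINE / R_INPUT
    print(f"Input current: {i_in * 1e6:.0f} uA RMS")         # ~120 uA
    print(f"Fraction of GFCI trip: {i_in / GFCI_TRIP:.1%}")  # ~2.4%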

The big problem with this, as you can see from the comments, is doing it 
safely, with properly rated parts, fault protection, and circuit layout 
and construction (especially clearance and creepage distances). Anyway, 
it can be done, but tends to be a PITA to do it right.

An obvious question is how much fidelity is needed. No matter how good 
your measuring system is, the results are only valid at the point you 
sample, and only somewhat representative of what you'd see at another 
spot - even in your own home. The incoming mains at your load center may 
be pretty solid, but every branch circuit will look a little different, 
depending on the loads and distance and so on. When appliances turn on 
and off, things will change throughout the system, and there will always 
be transients, and ubiquitous HF and RF interference from all the 
electronic gear in the system. If you look at it with high bandwidth, it 
can appear pretty disgusting, but it works for the main purpose of 
distributing plenty of power.

So, in order to remain blissfully ignorant of how ugly it may be, and to 
rig up something simple, safe, and easy, a transformer is the way to go. 
Whichever one you choose, some basic protections are usually desired: 
first, a small fuse or PTC on the hot side, to protect the transformer 
in case you accidentally short the secondary - a high-probability event 
when designing and experimenting. If the setup is experimental, and gets 
connected or changed around a lot, or you're fooling around on the 
primary side, it's a good idea to fuse the neutral connection the same 
way, so getting a cord reversed or such won't defeat the protection. 
Next, transient protection like TVSSs or MOVs can help protect the parts 
and measuring equipment. Transformers are built to handle all this, so 
they don't really need it - it's mostly for the other stuff. BTW, I 
noticed in your recent post that you were plugging into a power strip to 
get some protection. I'd recommend not bothering with this - it will 
tend to cause more distortion, and if it's shared with other loads, 
you'll be including their effects too, making it even more removed from 
the true mains signals. Build the protections into the unit itself, and 
you won't have to worry about placing it anywhere in the system.

Now for transformer selection. First, remember that power transformers 
are not built for signal integrity. They are optimized for maximum 
cheapness while adequately meeting the specs required for power transfer 
and packaging. The biggest cost factor is the amount of core and winding 
material needed to get the job done, so there are all sorts of 
trade-offs involved. The main thing is to use the smallest core 
possible, running at the highest flux density possible, along with the 
least amount of copper in the windings, to provide the function with 
"acceptable" core and winding loss, which typically may be 5-10 percent 
of the VA rating. Often, a temperature rise spec is provided, indicating the 
total real power loss in operation.
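
To put that in perspective, here's a tiny Python sketch; the 5-10 
percent range is from above, and the 50 VA example size is just an 
assumption for illustration:

    # Ballpark core-plus-winding loss from the 5-10 percent-of-VA rule
    # of thumb. The 50 VA transformer size is an assumed example.
    VA_RATING = 50.0
    for loss_fraction in (0.05, 0.10):
        print(f"{loss_fraction:.0%} of {VA_RATING:.0f} VA -> "
              f"{VA_RATING * loss_fraction:.1f} W dissipated")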

The simplest, biggest improvement you can make in signal fidelity is to 
get the flux level down. The easiest way is to use a transformer with a 
much higher primary voltage rating than the line voltage. IOW, for 
120 VAC, use a transformer with a 240 V primary. For a given VA size, this 
will give four times the magnetizing inductance, one fourth the 
magnetizing current, half the flux level, and one fourth the equivalent 
VA rating (if you were to use it as a power transformer), versus running 
at rated voltage. This greatly reduces core loss and improves linearity. 
The winding losses become very small in this application, since it's a 
signal transformer. The load on the source (line) is mostly the 
magnetizing current, through the primary resistance. The secondary will 
have virtually no load, since the (high impedance) instrumentation is 
just looking at the voltage, so there's hardly any wire loss.
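
Here's a minimal Python sketch of that scaling argument. The baseline 
is a hypothetical 120 V unit with 10 H of magnetizing inductance (a 
made-up value), and the same-VA 240 V unit is assumed to have roughly 
the same core with twice the primary turns:

    # Magnetizing inductance scales with turns squared; flux scales with
    # volts per turn; magnetizing current is V / (2*pi*f*L) for a sine.
    import math

    F_LINE = 60.0       # line frequency, Hz
    V_APPLIED = 120.0   # applied primary voltage, V RMS
    L_120V = 10.0       # assumed magnetizing inductance of a 120 V unit, H

    TURNS_RATIO = 2.0                     # 240 V unit: ~2x primary turns
    L_240V = L_120V * TURNS_RATIO ** 2    # 4x the magnetizing inductance

    def i_mag(v_rms, l_henries):
        """Magnetizing current drawn from the line by a shunt inductance."""
        return v_rms / (2 * math.pi * F_LINE * l_henries)

    print(f"I_mag, 120 V unit at 120 V: {1e3 * i_mag(V_APPLIED, L_120V):.1f} mA")
    print(f"I_mag, 240 V unit at 120 V: {1e3 * i_mag(V_APPLIED, L_240V):.1f} mA")
    print(f"Flux in the 240 V unit: {V_APPLIED / 240.0:.2f} of its rated level")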

Transformers are naturally bandpass filters, are inherently 
differential in normal use, and provide good isolation from the line 
environment.
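
For a feel of where the low end of that passband sits, here's a sketch 
of the usual first-order model: the magnetizing inductance and the 
driving source resistance form an L/R high-pass. Both values below are 
assumptions for illustration, not from this post:

    # Low-frequency -3 dB corner of a transformer driven from a resistive
    # source: f_low = R_source / (2 * pi * L_mag). Values are assumed.
    import math

    L_MAG = 40.0      # magnetizing inductance at the primary, H (assumed)
    R_SOURCE = 50.0   # source resistance driving the primary, ohms (assumed)

    f_low = R_SOURCE / (2 * math.pi * L_MAG)
    print(f"Low-frequency -3 dB corner: {f_low:.2f} Hz")  # ~0.2 Hz

Driven directly from the stiff, low-impedance mains, the corner lands 
far below line frequency, which is why the waveform comes through with 
good fidelity.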

There are other options to get even better performance. More on that later.

Ed