[volt-nuts] 34401A Why 10M ohm default i/p resistance?

ed breya eb at telight.com
Thu Apr 10 18:12:18 EDT 2014


Only specialized meters can provide virtually infinite input R at 
voltages above the native range of conventional amplifiers (10 to 
20 V or so), so you have to use some kind of attenuator to cover the 
higher ranges anyway. 10 megs and 1 meg (and sometimes 11) are the 
traditional values, with 10 megs of course loading the signal source 
less. It is difficult to get good resistor precision, stability, and 
voltage coefficient at higher values, so 10 megs is a good 
compromise.
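
To put numbers on the loading difference, here is a quick Python 
sketch (the 100k/100k divider figures come from Tony's message 
quoted below):

    # Fractional error a DVM's finite input R causes on a source
    # with Thevenin resistance r_s: reading = v_s * r_in / (r_s + r_in).
    def loading_error(r_s, r_in):
        return -r_s / (r_s + r_in)   # loading always pulls the reading low

    # 1 V into a 100k/100k divider: the meter sees 500 mV behind
    # 50k (100k in parallel with 100k).
    r_s = 50e3
    for r_in in (1e6, 10e6):
        e = loading_error(r_s, r_in)
        print("r_in = %.0e ohm: %+.3f%% -> %.3f mV"
              % (r_in, 100 * e, 500 * (1 + e)))
    # 1 meg loads it by -4.762% (476.190 mV); 10 megs by -0.498%
    # (497.512 mV instead of 500.000 mV).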

As far as I know, no DVM uses an actual 10 or 1 gigaohm resistor for 
its input termination - that's just an equivalent input R range 
(sometimes just a part spec right from the datasheet) for the 
high-impedance opamps and JFET circuits typically used to amplify DC 
at reasonably high accuracy and low noise. All it means is that the 
input is nearly non-loading to conventional DC circuits. If there is 
an actual resistor this large in there, it is just to keep the 
disconnected input from drifting away - it will settle at the bias 
current times the resistance, which can still be quite large. If the 
applied input voltage exceeds the native range, the protection 
circuitry will take over.
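
To put rough numbers on that (using the < 30 pA bias-current spec 
Tony quotes below):

    # Open-circuit reading = input bias current x equivalent input R.
    i_bias = 30e-12                  # A, spec limit from Tony's message
    for r_in in (10e6, 10e9):
        print("%.0e ohm -> %.1f mV" % (r_in, i_bias * r_in * 1e3))
    # 10 megs holds the open input at 0.3 mV; a 10 gig equivalent
    # would sit around 300 mV.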

For ultra-high Z applications, the equivalent input R would need to 
be in the teraohm range instead, using electrometer-class opamps, 
with much lower bias current, but higher offset voltage and noise.

If you put a DVM in the low ranges below 10 or 20 V, one without an 
actual termination R will tend to drift off due to input bias 
current. Once it's connected, the effect is much smaller (but not 
zero) since the source R is usually comparatively very small. One way 
to always assure a near-zero reading is to have a definite and fairly 
low input R (low enough not to show the bias current too obviously), 
hence the 10 megs option. It's also possible to make the actual value 
of the input R (and not just the dividing ratios) very precise - or 
measure it - so that its effect on measurements at known source 
resistances can be corrected for.
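
For instance, a minimal sketch of that correction (the numbers are 
from the divider case in Tony's message):

    # Undo the loading when source and input resistances are known:
    # v_source = v_meas * (r_s + r_in) / r_in
    def undo_loading(v_meas, r_s, r_in):
        return v_meas * (r_s + r_in) / r_in

    # 497.512 mV measured, 50k source (100k || 100k), 10 megs input:
    print(undo_loading(0.497512, 50e3, 10e6))   # -> ~0.500000 V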

As you have already figured out, in auto-ranging, a non-terminated 
DVM left disconnected and unattended will act as a relaxation 
oscillator and tend to wear out its front-end relays. Seeing no 
signal in the higher ranges, the system switches down to the lower 
ranges and is fine until the input drifts off to a range limit; it 
then up-ranges until it reaches a range with an attenuator, the 
signal goes back to zero, and the process repeats.
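
Here is a toy simulation of that cycle. The numbers are mine, not 
from any particular meter: 30 pA of bias current charging an assumed 
300 pF of input capacitance gives a drift of about 0.1 V/s, and the 
120%/10% up/down thresholds are just typical-looking autorange 
limits:

    # Open-input autorange "relaxation oscillator" sketch.
    I_BIAS, C_IN = 30e-12, 300e-12             # A, F (C_IN is assumed)
    RANGES = [0.1, 1.0, 10.0, 100.0, 1000.0]   # V full scale
    ATTENUATED = {100.0, 1000.0}               # ranges with the divider in

    v, rng, t, dt = 0.0, len(RANGES) - 1, 0.0, 0.1
    for _ in range(3000):
        fs = RANGES[rng]
        if abs(v) > 1.2 * fs and rng < len(RANGES) - 1:
            rng += 1                           # overload: relay clicks up
            print("t = %5.1f s: up to %g V range" % (t, RANGES[rng]))
        elif abs(v) < 0.1 * fs and rng > 0:
            rng -= 1                           # near zero: relay clicks down
        if RANGES[rng] in ATTENUATED:
            v = 0.0                            # divider terminates the input
        else:
            v += (I_BIAS / C_IN) * dt          # unterminated input drifts
        t += dt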

Ed

At 07:23 AM 4/10/2014, you wrote:
>There is no suggestion in the specifications for the 34401A that the 
>accuracy suffers by selecting 10G ohm input resistance on the 0.1 to 
>10V ranges, so why would they make 10M ohm the default? I can think 
>of very few cases where having the 10M ohm i/p resistor switched in 
>is better for accuracy than not.
>
>On the other hand 10M is sufficiently low to produce significant 
>errors on a 6 1/2 digit DVM for sources with resistances as low as 
>10 ohms. Measuring 1V divided by a 100k/100k ohm divider, for 
>example, causes a 0.5% error - 497.512mV instead of 500.000mV. That might not 
>be a problem but I wouldn't be surprised if this catches a lot of 
>people out (including me) when not pausing to do the mental 
>arithmetic to estimate the error. It's just too easy to be seduced 
>by all those digits into thinking you've made an accurate 
>measurement even though you discarded those last three digits.
>
>And if it's not a problem then you probably don't need an expensive 
>6 1/2 digit meter in the first place.
>
>It's a small point I agree but it can get irritating to have to keep 
>going into the measurement menus to change it when the meter is 
>turned on when measuring high impedance sources (e.g. capacitor 
>leakage testing).
>
>It can't be to improve i/p protection as 10M is too high to make any 
>significant difference to ESD and in any case there is plenty of 
>other over-voltage protection. OK, it provides a path for the DC 
>amplifier's input bias current, specified to be < 30pA at 25 degrees 
>C, but I imagine that varies significantly from one meter to the 
>next, and with temperature, so it's not useful for nulling out that error.
>
>So why would they do this? Could it be psychological? By limiting 
>the drift caused by the i/p bias current to 300uV max when the meter 
>is left unconnected? A voltmeter with a rapidly drifting reading 
>(several mV/s) when not connected to anything is a bit disconcerting 
>and would probably lead to complaints that the meter is obviously 
>faulty to users who are used to DVMs which read 0V when open circuit 
>- because they have i/p resistance << 10G ohms and don't have the 
>resolution to show the offset voltage caused by the i/p bias current.
>
>Personally I'd have thought that the default should be the other way 
>round - especially given that there is no indication on the front 
>panel or display as to which i/p resistance is currently selected.
>
>Any thoughts? What do other meters do?
>
>Tony H