[volt-nuts] 34401A Why 10M ohm default i/p resistance?

Joel Setton setton at free.fr
Thu Apr 10 13:58:02 EDT 2014


I think the 10 Meg default value became a de facto standard back in the days 
of VTVMs (vacuum-tube voltmeters): it was a convenient value that reduced 
input-circuit loading while remaining compatible with the grid current of 
the input triode. Designers of early solid-state voltmeters simply decided 
not to change a good thing.
Just my $0.02 worth!

Joel Setton


On 10/04/2014 18:55, Steven J Banaska wrote:
> As Tom said, the 10M input impedance is used for the high-voltage ranges
> because it is a resistive divider (9.9M/100k) that can handle high voltages
> without much drift. Caddock THV or HVD dividers are fairly common in
> precision DMMs.
>
> Typically you will find a high-impedance (10G) path that can be used for
> the 10V and lower ranges, but the 10M divider can be left connected and
> will work on any voltage range by changing which side of it you measure. As
> you mentioned, there can be an accuracy sacrifice when your source has a
> high output impedance. I'm not sure why 10M is the default, other than that
> it may extend the life of the relay that switches the 10M divider in or out.
>
> Steve
>
>
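For anyone following along, here is a minimal sketch of Steve's point about 
using the same 9.9M/100k string on every range (the resistor values are the 
ones quoted above; the Python framing is just my illustration). Reading the 
100k tap divides the input by 100 for the high-voltage ranges, while the low 
ranges can look at the undivided input through the 10G path.

# 9.9M/100k input divider, as described above
R_TOP = 9.9e6     # ohms, input HI to the tap
R_BOT = 100e3     # ohms, tap to LO

def tap_voltage(v_in):
    """Voltage at the 100k tap: the /100 output used on the HV ranges."""
    return v_in * R_BOT / (R_TOP + R_BOT)

print(tap_voltage(1000.0))   # 1000 V in -> 10.0 V at the tap
print(tap_voltage(300.0))    #  300 V in ->  3.0 V at the tap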
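And a rough illustration of the accuracy sacrifice from source loading that 
Steve mentions, assuming a 100 kohm source resistance as an example number of 
my own: the 10M input pulls the reading about 1% low, while a 10G input is 
down at the ppm level.

# Loading error: the meter's input resistance forms a divider with the
# source resistance, so V_measured = V_src * R_in / (R_in + R_src).
def loading_error(r_src, r_in):
    """Fractional amount by which the reading is low."""
    return r_src / (r_src + r_in)

R_SRC = 100e3   # assumed 100 kohm source output resistance (example only)
for label, r_in in (("10M input", 10e6), ("10G input", 10e9)):
    err = loading_error(R_SRC, r_in)
    print(f"{label}: reading {err*100:.4f} % low ({err*1e6:.1f} ppm)")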


