[volt-nuts] 3458A calibration

WarrenS warrensjmail-one at yahoo.com
Fri Aug 7 18:05:15 UTC 2009


Randy

The general way it is done is mostly with "smoke and mirrors", otherwise known as software scale factors and offset numbers.
The 10V cal example:
Measure a known external, approximately 10V signal, tell the unit what it should read for that voltage, and it then changes its S/W scale factor so that it reads that number.
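A rough sketch of that bookkeeping in Python (the names and structure are mine, just to show the idea, not HP's firmware):

    # Hypothetical sketch of the external 10V DCV cal step.
    # raw     = what the meter currently reads for the external standard
    # entered = the value the operator tells it the standard really is
    def external_dcv_cal(raw, entered, old_gain):
        # Scale the existing software gain so the meter would now
        # read exactly the entered value for that same input.
        return old_gain * (entered / raw)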

As for how they keep the other things, like the +/-12V supplies, from dominating the stability:
First, they use pretty good resistors and op amps, BUT the trick is they say you should do an autocal once a day or so.
That measures the internal 7 volt reference (whose value was stored in secure memory during the last external 10V cal), and if the 7 volts does not read the same, it knows that either the unit has changed or the 7V ref has changed. It assumes the unit changed and redoes its S/W scale factor to make the 7V reading the same as it was during the 10V cal.
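In rough code form, that autocal correction amounts to something like this (my illustration of the idea, not actual firmware):

    # At external cal time the unit stored the value it assigned to the
    # internal 7V reference (ref_assigned). At autocal it re-measures it.
    def autocal_dcv(ref_assigned, ref_reads_now, old_gain):
        # Assume the rest of the unit drifted, not the reference, and
        # rescale so the 7V reference reads its assigned value again.
        return old_gain * (ref_assigned / ref_reads_now)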

It does the same sort of thing as the above for all the other dividers, ranges, etc.
It measures a constant voltage (not necessarily an accurate one) on two different ranges and, using S/W scale factors, makes the two readings the same; a sketch of that transfer is below.
All it needs to know is the value of its 7 volt reference and its 10 kohm resistance standard; everything else is done in S/W by comparing readings and ratios.
What makes it all practical is its near perfect linearity and high resolution. 
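The range-to-range transfer works out to something like this (again just my sketch, with made-up names):

    # Sketch of a range-to-range gain transfer.
    # The same (not necessarily accurate) voltage is read on a range that
    # is already calibrated and on the range being calibrated; the new
    # range's gain is adjusted until the two readings agree.
    def transfer_range_gain(reading_cal_range, reading_new_range, new_range_gain):
        return new_range_gain * (reading_cal_range / reading_new_range)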

ws

************** 
In the HP Journal article on the calibration of the 3458A, regarding DC calibration it says:

   The internal 7V Zener reference is measured relative to an externally
   applied traceable standard.  A traceable value for this internal
   reference is stored in secure calibration memory...

Can someone explain this to me?  The 3458A's A/D converter uses +/-12V reference voltages, presumably derived from the internal 7.2V reference.  Are they saying that the external 10V standard is measured using the A/D converter or that the A/D converter's 12V reference is switched to be based on the externally applied 10V and then used to measure the internal 7.2V reference?  The former sounds like there would be too many variables to obtain a useful result, so the latter seems more plausible.  I think that I may have answered my own question.

I presume that the result of this comparison is not used directly to adjust the output of the reference, but just used to adjust later measurement results.  Right?

Related to this, how does the 3458A convert the 7.2V reference into the +/-12V references used by the converter?  Is it as simple as a couple of op amps and some precision gain-setting resistors?  Wouldn't the stability of those other components then dominate the overall stability of the converter?  Is something similar done in other LTZ1000-based voltage standards (such as the Datron 491x)?

Thanks.

Randy.


      


