[volt-nuts] Agilent calibration
Charles Steinmetz
csteinmetz at yandex.com
Wed Aug 14 01:41:43 EDT 2013
Joe wrote:
>The way I read this is that if I send them a DMM that is within spec, they
>won't adjust it or provide pre/post data. Is this the case? If I spend over
>$200 sending a DMM to them, I want it adjusted to the best possible specs
>and I want the data. I do not want someone just saying that it is good
>enough and send it back to me. I can get that for $50 in El Paso.
The big difference is not between adjusting and not adjusting -- it
is between getting a calibration "with full data" and getting one
without data. /The true value of calibration is not the adjustment
-- it is the data./
Agilent doesn't just say it is good enough -- they tell you
specifically how far off it is and quantify the statistical
uncertainty of their measurement. That is everything you need (i) to
correct readings you make with the instrument and (ii) to be
confident of the potential uncertainty of those measurements.
Let's say your meter has an uncertainty spec of +/- 15 uV (1.5 ppm)
total at 10 V. If your calibration certificate says the meter reads
dead on at 10.000000 V, the reading shown on the display is your
measurement result (with an uncertainty of 1.5 ppm, or +/- 15 counts
from the reading) when you measure a 10 V source. But the cal
certificate could just as well say that the meter reads 10.000008 V
when measuring a 10.000000 V source. In that case, you know to
subtract 0.000008 V from whatever the meter reads when you measure a
10 V source to get your measurement result (again, with an uncertainty
of 1.5 ppm, or +/- 15 counts from the corrected reading).
in the real world a voltage standard will have its own calibration
offset, so you will make two corrections when you measure your house
"10 V" standard to verify that your meter is still in calibration.
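To make the arithmetic concrete, here is a small Python sketch. All the numbers (the meter's offset, the standard's offset, the displayed reading) are made up for illustration, not taken from any real certificate:

```python
# Hypothetical calibration-correction arithmetic.
# Offsets below are illustrative only, not from a real cal certificate.

METER_OFFSET = 0.000008   # cert says: meter shows 10.000008 V on a true 10.000000 V source
STD_OFFSET = -0.000003    # cert for the house standard: its true output is 10 V + STD_OFFSET

def corrected_reading(display_v, meter_offset=METER_OFFSET):
    """Subtract the meter's known offset from the displayed value."""
    return display_v - meter_offset

# Measuring the house "10 V" standard involves both corrections:
display = 10.000005                               # what the meter shows
value = corrected_reading(display)                # what the meter actually saw
error_of_standard = value - (10.0 + STD_OFFSET)   # remaining disagreement with the standard
```

With these made-up numbers the two corrections cancel almost exactly, which is what you would hope to see when verifying that the meter is still in calibration.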
So, why wouldn't they adjust every instrument to be "spot
on"? Because metrologists have determined that, as a general matter,
leaving the adjustments alone results in better overall stability
of instruments.
Adjusting instruments inevitably causes a new drift and settling
cycle, so if you adjust everything as close to perfect as possible
every time you calibrate, you will always be on the steepest portion
of the settling curve. On the other hand, you can benefit from the
long, ever-decreasing tail of the settling cycle by not adjusting as
long as the instrument is within the manufacturer's
specifications. Further, seeing how the instrument changes from one
calibration interval to the next builds confidence in the readings
you make when the last calibration is no longer fresh.
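One way to picture the settling argument is a toy model in which drift rate decays exponentially after each adjustment. The time constant and initial rate below are arbitrary assumptions for illustration, not real instrument data:

```python
import math

# Toy model: drift rate after an adjustment decays as exp(-t/tau).
# Both constants are hypothetical, chosen only to illustrate the shape.
TAU_DAYS = 180.0          # assumed settling time constant
RATE0_PPM_PER_DAY = 0.05  # assumed drift rate right after adjustment

def drift_rate(days_since_adjustment):
    """Drift rate (ppm/day) as a function of time since the last adjustment."""
    return RATE0_PPM_PER_DAY * math.exp(-days_since_adjustment / TAU_DAYS)

fresh = drift_rate(0)     # steepest part of the settling curve
aged = drift_rate(730)    # far down the ever-decreasing tail
```

In this model, every adjustment resets the clock to t = 0, putting the instrument back on the steep part of the curve, whereas an instrument left alone keeps sliding down the flat tail.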
Best regards,
Charles