06-09-2010 08:03 AM
I have a question related to verification/calibration of the analog inputs on M-Series 16-bit multifunction devices.
We have a Fluke 5500A in house, but it is only accurate to 50 ppm. Would it be acceptable to use it, or another voltage source, along with a DMM that is accurate to 0.001% (10 ppm)? The DMM would be used to verify the actual voltage applied to the analog input during the calibration procedure.
Thanks,
Tim
06-11-2010 09:36 AM
Hi Tim,
As per the Calibration Procedure manual, National Instruments recommends using a calibrator with at least 10 ppm accuracy for 16-bit devices. With a less accurate calibrator, you cannot guarantee that the device is calibrated within its specifications.
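To put those numbers in perspective, here is a quick back-of-the-envelope comparison (the ±10 V range and the 10 V test point below are only illustrative assumptions for a 16-bit input, not values taken from the Calibration Procedure manual):

/* Back-of-the-envelope check: compare source uncertainty against one LSB
 * of a 16-bit converter on an assumed +/-10 V input range. */
#include <stdio.h>

int main(void)
{
    const double full_scale = 20.0;               /* +/-10 V span, in volts      */
    const double lsb        = full_scale / 65536; /* one 16-bit code, ~305 uV    */
    const double v_test     = 10.0;               /* near-full-scale test point  */

    double err_50ppm = 50e-6 * v_test;            /* 5500A alone: 500 uV         */
    double err_10ppm = 10e-6 * v_test;            /* 10 ppm reference: 100 uV    */

    printf("1 LSB         : %.1f uV\n", lsb * 1e6);
    printf("50 ppm @ 10 V : %.1f uV\n", err_50ppm * 1e6);
    printf("10 ppm @ 10 V : %.1f uV\n", err_10ppm * 1e6);
    return 0;
}

In other words, a 50 ppm error near full scale is larger than one LSB of the converter, while a 10 ppm reference stays comfortably below it.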
06-11-2010 12:44 PM
Hi Tim,
Sorry for the misunderstanding. You will be able to use the 5500A with a 0.001% (10 ppm) accurate DMM, as long as you use the voltage measured by the DMM, rather than the 5500A setpoint, as the reference voltage during calibration. Please let me know if you have any further questions.
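If you script the adjustment through the NI-DAQmx C API, the sketch below shows where the DMM reading goes. This is only a schematic: the real M-Series procedure in the calibration document involves more steps (verification points, multiple reference values, per-range adjustment), the function names come from NIDAQmx.h and should be checked against the API reference for your driver version, and "Dev1" plus the example reading are placeholders.

/* Schematic only -- see your device's Calibration Procedure document for the
 * complete adjustment sequence. */
#include <NIDAQmx.h>
#include <stdio.h>

int main(void)
{
    CalHandle cal = 0;
    int32     err;
    float64   dmmReadingVolts = 9.999873;   /* value actually measured on the DMM */

    /* Open an external-calibration session ("NI" is the factory-default
     * calibration password). */
    err = DAQmxInitExtCal("Dev1", "NI", &cal);
    if (err < 0) { printf("InitExtCal failed: %d\n", (int)err); return 1; }

    /* With the 5500A output connected to the analog input called out in the
     * procedure, pass the DMM's reading -- not the 5500A setpoint -- as the
     * reference voltage for the adjustment. */
    err = DAQmxMSeriesCalAdjust(cal, dmmReadingVolts);

    /* Commit the new constants only if the adjustment succeeded; otherwise
     * cancel so the existing calibration is left untouched. */
    DAQmxCloseExtCal(cal, err >= 0 ? DAQmx_Val_Action_Commit
                                   : DAQmx_Val_Action_Cancel);
    return err < 0;
}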