

Calibrating M-Series Analog Inputs using a secondary standard to measure voltage source

I have a question related to verification/calibration of the analog inputs on M-Series 16-bit multifunction devices.

 

We have a Fluke 5500A in house, but it is only 50 ppm accurate. Would it be acceptable to use it, or another voltage source, along with a DMM that is 0.001% (10 ppm) accurate? The DMM would be used to verify the voltage actually applied to the analog input during the calibration procedure.
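
To put numbers on that (a quick sketch; the 5 V test point is just an example):

```python
# Worst-case error contributed by each instrument at an example 5 V test point.
test_voltage = 5.0           # volts; example test point only
calibrator_ppm = 50          # Fluke 5500A output accuracy, per its spec
dmm_ppm = 0.001 / 100 * 1e6  # 0.001% expressed in ppm -> 10 ppm

# Handy identity: volts * ppm = microvolts.
err_cal_uv = test_voltage * calibrator_ppm  # +/-250 uV from the 5500A alone
err_dmm_uv = test_voltage * dmm_ppm         # +/-50 uV if the DMM measures it

print(f"5500A: +/-{err_cal_uv:.0f} uV, DMM: +/-{err_dmm_uv:.0f} uV")
```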

 

Thanks,

Tim

 

Message 1 of 4

Hi Tim,

 

As per the Calibration Procedure manual, National Instruments recommends a calibration standard with an accuracy of 10 ppm or better for 16-bit devices. With a less accurate standard, you cannot ensure that your device is calibrated to its published specifications.
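
As a rough sanity check of why the recommendation is so tight: assuming the ±10 V input range, one LSB of a 16-bit converter already corresponds to only about 15 ppm of the span, so the standard has to be at least that good:

```python
# One code width of a 16-bit converter over an assumed +/-10 V range.
span = 20.0      # volts; assumes the +/-10 V input range
codes = 2 ** 16  # 16-bit converter

lsb_v = span / codes          # one code width, ~305 uV
lsb_ppm = lsb_v / span * 1e6  # ~15.3 ppm of span

print(f"1 LSB = {lsb_v * 1e6:.0f} uV = {lsb_ppm:.1f} ppm of span")
```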

Best regards,
Rohan B
Message 2 of 4
Thanks for the reply, Rohan. I realize that I need 10 ppm accuracy. My question was whether I can use a secondary standard to obtain the required accuracy; I'm only talking about verifying the analog inputs. For example, if I use a bench power supply, adjust its output to the desired level, and verify that level with a DMM that has the required accuracy, won't that satisfy the accuracy requirement?
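
Something like this sketch is what I have in mind (using the nidaqmx Python API; the device/channel name, sample rate, and DMM reading are placeholders):

```python
import statistics
import nidaqmx

# Reference voltage as read from the 0.001% DMM; this value, not the
# supply's setpoint, is what the DAQ reading gets compared against.
dmm_reference = 4.99987  # volts; example reading

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0", min_val=-10.0, max_val=10.0)
    task.timing.cfg_samp_clk_timing(rate=10000.0, samps_per_chan=1000)
    samples = task.read(number_of_samples_per_channel=1000)

measured = statistics.mean(samples)                  # average out noise
error_ppm = (measured - dmm_reference) / 10.0 * 1e6  # ppm of the 10 V range

print(f"DAQ mean: {measured:.6f} V, error vs. DMM: {error_ppm:+.1f} ppm of range")
```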
Message 3 of 4

Hi Tim,

 

Sorry for the misunderstanding. Yes, you will be able to use the 5500A with a 0.001% accurate DMM, as long as you use the voltage measured by the DMM, rather than the 5500A's setpoint, as the reference voltage during calibration. Please let me know if you have any further questions.
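
One note on the error budget: once the DMM measures the actual applied voltage, the 5500A's 50 ppm absolute accuracy drops out, but the source's noise and drift between the DMM reading and the DAQ reading still add in. A minimal sketch of the usual root-sum-square combination (the 2 ppm stability figure is purely illustrative; substitute your source's actual short-term spec):

```python
import math

u_dmm_ppm = 10.0       # the 0.001% DMM
u_stability_ppm = 2.0  # ILLUSTRATIVE ONLY -- use the source's actual
                       # short-term noise/drift over the measurement interval

# Root-sum-square of the remaining uncertainty terms.
u_ref_ppm = math.sqrt(u_dmm_ppm**2 + u_stability_ppm**2)
print(f"Combined reference uncertainty: ~{u_ref_ppm:.1f} ppm")
```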

Best regards,
Rohan B
Message 4 of 4