10-01-2013 11:25 AM
Hi all,
I'm uncertain about what input voltage range my DAQ card will use in my situation.
I'm using an NI 9205 analog input module in a cDAQ-9188 chassis. The 9205 has programmable input ranges of +/-200 mV, +/-1 V, +/-5 V, and +/-10 V. I'm programming in VB.NET in Measurement Studio 2010 for Visual Studio 2010.
I'm measuring DC voltages from multiple sensors, all in the "0 - X" VDC range. For example, some of the sensor outputs are 0-6 VDC, 0-4 VDC, 0-10 VDC, etc. I'm individually programming each input range for each analog input channel. For example:
AITask.AIChannels.CreateVoltageChannel("/AI1/ai0", "", AITerminalConfiguration.Differential, 0.0, 10.0, AIVoltageUnits.Volts) for my 0-10 VDC sensor.
One of my sensors outputs a 0-2 VDC signal. I understand that the DAQ module will choose the best range for me based on my input settings.
Here is my uncertainty: does the module choose a setting based on the magnitude of the upper voltage I provide, or on the difference between the minimum and maximum voltages of the sensor output? For example, my range is 0-2 volts. Will it choose +/-1 V as the setting (which spans a difference of 2 volts)? Or will it choose +/-5 V, because 5 is a larger magnitude than 1 and +/-5 is the next available range in the list? If the sensor outputs 0-4 VDC, then I understand it would use +/-5 V; a span of 4 volts will not fit into +/-1 V, but it will fit into +/-5 V.
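For reference, here is roughly how I would create that 0-2 VDC channel (just a sketch; the physical channel name "cDAQ1Mod1/ai2" is a placeholder for my actual hardware):

' Requested limits of 0.0 to 2.0 V for the 0-2 VDC sensor; my question is
' whether DAQmx coerces this to the +/-1 V range (a 2 V span) or to +/-5 V.
AITask.AIChannels.CreateVoltageChannel("cDAQ1Mod1/ai2", "", _
    AITerminalConfiguration.Differential, 0.0, 2.0, AIVoltageUnits.Volts)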
Also, is there a function somewhere in the API I can use to query the setting that has been chosen based on my input ranges?
Thanks!
10-01-2013 09:17 PM
Definitely +/-5 V. Try checking the RangeHigh and RangeLow properties of the AIChannel class.
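Something like this untested sketch should print the range DAQmx actually coerced your 0-2 V channel to (the physical channel name is a placeholder, and it assumes Imports NationalInstruments.DAQmx at the top of the file):

' Create a task and the 0-2 VDC channel with requested limits of 0.0 to 2.0 V.
Dim AITask As New Task()
Dim chan As AIChannel = AITask.AIChannels.CreateVoltageChannel("cDAQ1Mod1/ai2", "", _
    AITerminalConfiguration.Differential, 0.0, 2.0, AIVoltageUnits.Volts)

' Verifying the task makes DAQmx validate the configuration and coerce the range.
AITask.Control(TaskAction.Verify)

' RangeLow and RangeHigh report the hardware input range DAQmx actually selected.
Console.WriteLine("Coerced range: {0} V to {1} V", chan.RangeLow, chan.RangeHigh)
AITask.Dispose()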
10-03-2013 08:53 AM
Thanks