01-30-2025 01:01 PM
With the 7846R target's I/O configuration set to "raw" (so that my outputs work correctly), I need to convert the analog inputs back to calibrated values, since the configuration mode can't be set separately for inputs and outputs. I used the reference example code located here: R-Series IO Conversion - Discussion Forums - National Instruments
In calibrated mode, I can wire the analog input directly to the "DC and RMS Measurements" VI and get the correct result (4.5 Vrms). In raw mode, after performing the conversion from the example code (shown below), I only get 4 Vrms.
I admittedly do not understand the conversion process. I have tried to read through the NI-provided literature (e.g., Working With Calibrated and Uncalibrated Data on NI R Series - NI) and some of the related discussion forum posts, but I am still not getting it. Any help would be appreciated.
Code for reference:
01-31-2025 07:43 PM
For a 16-bit ADC with an input range of ±10 V, each raw integer level is equivalent to
20 V / (2^16 - 1) ≈ 0.000305 V, or 0.305 mV.
However, your conversion doesn't include the calibration offset, as described in Working With Calibrated and Uncalibrated Data on NI R Series.
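For illustration only, here is a minimal sketch in Python of what that kind of conversion looks like, assuming the calibrated form is voltage = raw × LSB weight − offset (check the article for the exact formula and units on your device). The LSB weight below is the ideal, uncalibrated value for a ±10 V, 16-bit input; the names and the zero offset are placeholders, not values from any actual 7846R.

```python
# Minimal sketch (not NI's code): convert a raw 16-bit reading to volts,
# assuming a conversion of the form  voltage = raw * lsb_weight - offset.
# The nominal LSB weight is the ideal value for a +/-10 V, 16-bit input;
# a real device's lsb_weight and offset come from its calibration constants.

NOMINAL_LSB_WEIGHT = 20.0 / (2**16 - 1)   # ~0.000305 V per count (ideal)

def raw_to_volts(raw_count, lsb_weight=NOMINAL_LSB_WEIGHT, offset_v=0.0):
    """Convert a signed raw ADC count to volts.

    raw_count  : signed integer from the FPGA I/O node (I16 range)
    lsb_weight : volts per count; replace with the device's calibrated value
    offset_v   : calibration offset in volts; placeholder value here
    """
    return raw_count * lsb_weight - offset_v

# Example: full-scale positive count with the ideal LSB weight and zero offset
print(raw_to_volts(32767))   # ~= 10.0 V
```

With the ideal LSB weight and no offset this reproduces the 0.305 mV-per-count scaling above; the point of the article is that the device's actual LSB weight and offset differ slightly from these nominal values, which is where the missing-offset error comes from.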
02-01-2025 12:11 PM
Hi ZYOng, as always, I appreciate your responses!
I understand the resolution aspect of the on-board ADC, but the problem is that the article you linked assumes a host application: the LSB weight and offset are passed up to the host and run through "Binary to Nom.vi" there.
I cannot find that VI anywhere; the path given in the article doesn't exist, and googling yields no results. Additionally, that VI seems to be intended for use in the host application, not the FPGA VI.
With that in mind, how would the offset parameter be applied to the conversion I am using in the FPGA VI?
Thanks!