09-15-2023 12:02 PM
Hello,
I am working on an application that uses a PXIe-5105 digitizer, and I am wondering if anyone knows how this particular device responds to input signals outside of its configured vertical range. In particular, I would like to know if there is any way to programmatically detect when an input signal goes outside of the configured vertical range, preferably via the NI-SCOPE API VIs, or through other methods. I am aware that the voltage signal clips, which is fine, but I would like some way to programmatically detect when this clipping occurs, if possible.
Does anyone know if what I am thinking of is possible/feasible to implement programmatically?
I am developing the application using LabVIEW 2019 on a Linux RT target.
Regards,
Johnathan
09-21-2023 08:30 AM - edited 09-21-2023 08:32 AM
Read the raw I16 data plus the scaling info, and do a range check against the top and bottom 16 codes of your 4096 ADC codes?
Since the actual range is usually a bit higher than the nominal, you can calculate the actual boundaries by looking at the gain and offset properties (or wfrm_info).
Which check is faster? In Range and Coerce on the array, other comparisons, or ???
Maybe the NI-SCOPE read or fetch already gives a warning? Have you checked the error out for warnings?
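Roughly this idea, written out in Python since I can't paste a VI as text (the full-scale codes and the guard band are assumptions for a 12-bit device; check them against the gain/offset your session actually reports):

    import numpy as np

    # Assumed ADC code limits for a 12-bit digitizer returned as I16 data;
    # verify against the actual scaling info of the PXIe-5105.
    FULL_SCALE_POS = 2047
    FULL_SCALE_NEG = -2048
    GUARD_CODES = 16  # warn when a sample is within 16 codes of either rail

    def near_rail(raw_i16: np.ndarray) -> bool:
        """True if any raw sample sits at or near the top/bottom ADC codes."""
        return bool(np.any(raw_i16 >= FULL_SCALE_POS - GUARD_CODES) or
                    np.any(raw_i16 <= FULL_SCALE_NEG + GUARD_CODES))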
09-25-2023 10:47 AM - edited 09-25-2023 10:48 AM
Henrik_Volkers, can you elaborate on what you mean by "Since the actual range is usually a bit higher than the nominal, you can calculate the actual boundaries by looking at the gain and offset properties (or wfrm_info)"?
As an update, I was able to speak to an NI Technical Support Engineer about this, and there does not appear to be any property available for these devices in NI-SCOPE that can be queried for a clipping event. The TSE advised reading the 1D I16 or 1D I32 data and inferring that clipping happened if the maximum integer value is reported by the fetch. This does not allow distinguishing between the signal actually sitting at the maximum value and clipping having occurred, so their suggestion was to use a range bigger than the expected signal and programmatically infer that clipping occurred, rather than treating the max/min value as the true measurement value.
Maybe at some point in the future NI could add a feature for clipping detection in their digitizer products, but it doesn't appear to exist currently.
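For anyone finding this later, the TSE's suggestion boils down to something like this (sketched in Python rather than LabVIEW; the expected signal span and configured range below are made-up numbers for illustration):

    import numpy as np

    EXPECTED_PEAK_V = 0.8  # largest signal we actually expect (assumed)
    # The vertical range is configured wider than the expected signal, e.g. +/-1 V,
    # so any sample beyond the expected peak is treated as an out-of-range event
    # rather than a legitimate measurement.

    def exceeded_expected_range(scaled_v: np.ndarray) -> bool:
        """True if any scaled (volts) sample is beyond the expected signal span."""
        return bool(np.any(np.abs(scaled_v) > EXPECTED_PEAK_V))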
Regards,
Johnathan
09-26-2023 03:07 AM - edited 09-26-2023 03:17 AM
The actual range is usually (always) a bit greater than the nominal range: say your range is 2 V (-1 V to +1 V), the actual range that is measured might be -1.123 V to +1.125 V.
Here is a quick block diagram of what I think could be a range check for I16 data; I can't check it since my units are in use 😉
The warning bit range should be a positive value in the range 0 to ?? (maybe 5 bits, or you calculate it from the actual-to-nominal range).
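In text form the threshold calculation is roughly this (Python shorthand; the coefficient names and values are placeholders for whatever your scaling info actually returns):

    # Placeholder scaling values; take the real ones from the gain/offset properties.
    nominal_max_v = 1.0          # configured range: -1 V to +1 V
    gain_v_per_code = 0.55e-3    # volts per ADC code
    offset_v = 0.0

    # Raw codes corresponding to the nominal -1 V and +1 V boundaries.
    nominal_min_code = round((-nominal_max_v - offset_v) / gain_v_per_code)
    nominal_max_code = round((nominal_max_v - offset_v) / gain_v_per_code)

    def outside_nominal(raw_i16):
        """True if any raw sample lies beyond the nominal range boundaries,
        even though the hardware still digitizes it (actual range > nominal)."""
        return any(s <= nominal_min_code or s >= nominal_max_code for s in raw_i16)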
09-26-2023 03:24 AM
And as a little add-on: a range-used indicator 😄
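In text form that would be something like (Python shorthand, assuming raw I16 data from a 12-bit converter, i.e. 4096 codes full span):

    def range_used_percent(raw_i16, full_span_codes=4096):
        """Peak-to-peak codes as a percentage of the full ADC span."""
        return 100.0 * (max(raw_i16) - min(raw_i16)) / full_span_codes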
09-26-2023 10:05 AM - edited 09-26-2023 10:07 AM
Henrik_Volkers, okay, I see now. I did not know the actual range would extend past the set range. In the case that it does, what you recommended makes sense, and I might try out what you suggested.
Effectively, your recommendation aligns with what the technical support engineer advised, with the added information that the actual measurement range likely extends past the set range.
Thanks for the examples above.
Regards,
Johnathan