When displaying analog values from sensors such as temperature probes (PT100), flowmeters, and pressure transducers, it is quite irritating to see the second decimal, or at times even the first decimal, rolling rapidly. The card is a PCI-6229, used with LabVIEW 8.0 on Windows XP.
Most of the time it is due to noise riding on the signal. But even when we follow strict guidelines on shielding, grounding, and differential-mode inputs, I still cannot completely eliminate it. All my channels have low-pass filters to remove the AC line noise, and I also resort to a moving average of at least 5 samples (a sketch of that filter is below). One would expect a channel like room temperature to be stable - it simply cannot vary that fast.
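For reference, this is the kind of averaging I mean. I can't paste the LabVIEW diagram here, so this is a minimal Python sketch; the window size of 5 matches what I use, while the sample values are made up for illustration:

```python
from collections import deque

def moving_average_stream(samples, window=5):
    """Smooth a stream of readings with a simple moving average.

    `samples` is any iterable of raw readings; `window` is the
    number of samples averaged together (I use at least 5).
    """
    buf = deque(maxlen=window)
    for x in samples:
        buf.append(x)
        # Average over however many samples we have so far,
        # up to the last `window` readings.
        yield sum(buf) / len(buf)

# Made-up noisy room-temperature readings around 25.0 degC
raw = [24.97, 25.03, 24.99, 25.05, 24.96, 25.02, 24.98]
for smoothed in moving_average_stream(raw):
    print(f"{smoothed:.2f}")
```

Even with this in place, the last displayed digit still dances.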
What I want to see on the screen is a display that is as steady as an analogue meter. Is there any proven method in software, apart from a moving average, to do this? Reducing the display precision would be another way, but then it also reduces the resolution of the required display (illustrated below).
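To make that trade-off concrete, here is a small Python sketch of what I mean by reducing display precision. The numbers are made up; the point is that rounding hides jitter within a rounding step but also hides real changes of the same size, and values near a boundary still flicker:

```python
def display_value(reading, decimals=1):
    """Round the reading for display only; keep full precision in the log."""
    return round(reading, decimals)

# Jitter within one rounding step disappears from the display...
print(display_value(24.96), display_value(25.04))    # 25.0 25.0
# ...but a genuine change smaller than the step is also invisible,
# and a value sitting near a boundary still flickers:
print(display_value(25.049), display_value(25.051))  # 25.0 25.1
```

So rounding alone doesn't give me the steady, analogue-meter-like display I am after.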
Thanks
Raghunathan
LabVIEW to Automate Hydraulic Test rigs.