06-02-2011 01:51 PM
I'm new at this whole LabVIEW thing, I know this is a pretty simple question, but:
I'm creating a fixture that has two pressure transducers: a low-pressure transducer and a high-pressure transducer. The problem is that there's noise in the signal that I want to reduce. I only have the LabVIEW Base package, so I can't use the handy Median PtByPt.vi, but I wrote something that I thought would work (attached). The issue is twofold:
1) I'm not getting proper pressure readouts when I know the system is at vacuum.
2) The filtered reading responds too slowly for the application this is being used in; e.g. going from atmospheric pressure to vacuum takes minutes to show up when it should take seconds. I think I could solve this with a case structure that analyzes the array to see whether the pressure changes by more than a certain amount and, if so, skips the median (see the rough sketch at the end of this post).
Any help would be greatly appreciated!
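Roughly, the idea in point 2 looks like this in Python (just a sketch of the logic, not LabVIEW; the window size and threshold are placeholder values I made up):

from collections import deque
from statistics import median

WINDOW = 15        # number of recent samples kept in the buffer
THRESHOLD = 5.0    # pressure change treated as a real step, in sensor units

buffer = deque(maxlen=WINDOW)

def filtered(sample):
    """Return one noise-reduced reading per new sample."""
    if buffer and abs(sample - buffer[-1]) > THRESHOLD:
        # Large jump: trust the new reading and restart the buffer,
        # so the output can follow a fast pump-down instead of lagging.
        buffer.clear()
    buffer.append(sample)
    return median(buffer)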
06-03-2011 10:58 AM
Hey there, I took a look at your program. The way you have it now, it will only update your values when it exits the For Loop. Is this the behavior you wanted?
You can use either a case structure or just some logic and comparators to accomplish your second point. Do you want to check the array for max and min values or use something else to select your case in the Case Structure?
Are you employing custom scaling to get the readouts to be correct when the system is at vacuum?
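In text form, the max/min comparator check would be something like this (Python sketch; the threshold is an assumed example value):

def pressure_is_moving(samples, threshold=5.0):
    # Boolean that could drive the Case Structure selector:
    # True when the spread of the buffered samples exceeds the threshold,
    # i.e. the pressure is genuinely changing rather than just noisy.
    return (max(samples) - min(samples)) > threshold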
06-03-2011 02:07 PM
Just make your own point-by-point mean. I made one here.
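The core of it is just a fixed-length buffer that outputs the mean of whatever it currently holds. In Python terms (a minimal sketch; the buffer length is arbitrary):

from collections import deque

class MeanPtByPt:
    """Point-by-point mean: feed in one sample, get back the mean
    of the last n samples seen so far."""
    def __init__(self, n=50):
        self.buf = deque(maxlen=n)

    def __call__(self, sample):
        self.buf.append(sample)
        return sum(self.buf) / len(self.buf)

# usage: smooth = MeanPtByPt(50); reading = smooth(raw_value)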
07-05-2011 05:55 PM
Thanks for the help. I ended up tweaking the VI so that it takes N samples (N = 500), converts the data to an array, and takes the mean of that array.
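In rough terms the fix amounts to this (Python sketch for reference; read_pressure() is a stand-in for the actual DAQ read, which I'm not reproducing here):

N = 500

def averaged_reading(read_pressure):
    # Acquire N samples per update and report their mean as one reading.
    samples = [read_pressure() for _ in range(N)]
    return sum(samples) / len(samples)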
Thanks again!