
Low Pass Filter and Mean signal processing

Solved!
Go to solution

Hi guys, 

 

I am using LabVIEW and a DAQ to measure voltage from a strain gauge. Specifically, I am interested in the rate of change from the mean of one set of samples to the next. In my current setup I take the average of 500 samples acquired at a rate of 1000 Hz and compare it to the next set using a shift register.
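
In text form, the block-averaging logic looks roughly like this (a Python sketch of the idea, not my actual VI; the signal and numbers are made up for illustration):

    import numpy as np

    FS = 1000      # sampling rate in Hz
    BLOCK = 500    # samples per block, i.e. 0.5 s of data per mean

    rng = np.random.default_rng(0)
    volts = 2.5 + 0.01 * rng.standard_normal(FS * 10)      # stand-in for the strain-gauge voltage

    block_means = volts.reshape(-1, BLOCK).mean(axis=1)    # one mean per 500-sample block
    rate_of_change = np.diff(block_means) / (BLOCK / FS)   # change between consecutive means, in V/s

    print(block_means[:3], rate_of_change[:3])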

 

To further clean up the data I have added a low-pass filter. It is working, but no data is produced if I set the cutoff frequency to 500 Hz or greater. This makes sense to me, since I think the 500 samples must be collected by the DAQ before being converted to an array and sent to the filter (please correct me if this thinking is wrong).

 

Anyway, I guess what I am wondering is: is there some sort of multiplication factor I should think about to work out what frequency I am actually filtering, since the filter is operating on these arrays? Or am I completely off on this?

 

To summarize my question: why is ~500 Hz my maximum cutoff frequency when my sampling rate is 1000 Hz?

 

Thanks!

0 Kudos
Message 1 of 5
(510 Views)
Solution
Accepted by topic author LabView_Ben

@LabView_Ben wrote:

 

To summarize my question: why is ~500 Hz my maximum cutoff frequency when my sampling rate is 1000 Hz?

 

Thanks!


The Nyquist-Shannon sampling theorem (Nyquist) states that a signal sampled at a rate F can be fully reconstructed if it contains only frequency components below half that sampling frequency: F/2. This frequency is known as the Nyquist frequency. 

 

Your filter works up to the Nyquist frequency, 500 Hz.
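
As a rough text illustration (a Python/SciPy sketch, not your LabVIEW code; the Butterworth design and the 450 Hz value are just assumptions for the example), a digital low-pass cutoff has to sit strictly below half the sampling rate (F/2):

    import numpy as np
    from scipy import signal

    FS = 1000
    block = np.random.default_rng(1).standard_normal(500)   # stand-in for one 500-sample block

    b, a = signal.butter(4, 450, btype="low", fs=FS)         # 450 Hz < Nyquist (500 Hz): works
    filtered = signal.filtfilt(b, a, block)

    try:
        signal.butter(4, 500, btype="low", fs=FS)            # 500 Hz is at Nyquist: rejected
    except ValueError as err:
        print("cutoff at or above F/2 is invalid:", err)

The same limit is what you are seeing in LabVIEW: the filter stops producing output once the cutoff reaches 500 Hz.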

 

https://mathworld.wolfram.com/NyquistFrequency.html

 

https://www.gatan.com/nyquist-frequency

 

Message 2 of 5
(483 Views)

Ahh yes, that makes sense. The reason for my confusion is that I was playing around with the example code called "extracting the sine" and mixed up the sampling rate and the cutoff frequency.

 

Thanks for the help!

 

 

0 Kudos
Message 3 of 5
(423 Views)

Just another question if you don't mind...

 

For setting the cutoff frequency and order of my filter, my plan was to measure the voltage in an unchanging state with different orders and cutoff frequencies until I find the least noisy results...

 

I feel like this is unscientific, but I can't think of a better way to determine these values. If I were measuring a periodic signal, I could obviously set my cutoff frequency based on the known frequency of that signal.

 

But since I am simply measuring a DC voltage, I think trial and error is the best way to set the values.
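
In text form, the sweep I have in mind would look something like this (a Python sketch, assuming the steady-state signal is just a constant voltage plus broadband noise; the orders and cutoffs are arbitrary examples, not values I have settled on):

    import numpy as np
    from scipy import signal

    FS = 1000
    rng = np.random.default_rng(2)
    recording = 2.5 + 0.01 * rng.standard_normal(FS * 5)    # 5 s of simulated steady-state data

    for order in (2, 4, 6):
        for cutoff in (10, 50, 100, 200):                   # Hz, all below the 500 Hz Nyquist limit
            b, a = signal.butter(order, cutoff, btype="low", fs=FS)
            quiet = signal.filtfilt(b, a, recording)
            print(f"order={order} cutoff={cutoff:>3} Hz  std={quiet.std():.5f} V")

The (order, cutoff) pair with the smallest remaining standard deviation would be my pick.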

 

Is there a better way to do this?

 

Thanks again

 

 

0 Kudos
Message 4 of 5
(409 Views)

Since you are measuring a DC voltage:

  1. White noise and some other noise sources are proportional to the frequency bandwidth, so an easy way to reduce some of your noise is to reduce the sampling rate of your measurement. You can experiment with this by looking at the "hash" on the signal with no input while changing your sampling rate.
  2. All real measurements have 1/f noise, which increases as frequency decreases; that is bad for DC measurements, and there is not a whole lot you can do to avoid it.
  3. After taking a measurement, get statistics on your DC reading: the mean and standard deviation will tell you how noisy your signal is and whether you are moving in the right direction (see the sketch below).
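
As a rough sketch of points 1 and 3 (Python, assuming the measurement is a constant voltage plus white noise; the values are made up), the standard deviation quantifies the noise, and averaging down to a lower effective rate (a narrower bandwidth) shrinks it by roughly the square root of the averaging factor:

    import numpy as np

    FS = 1000
    rng = np.random.default_rng(3)
    measurement = 2.5 + 0.01 * rng.standard_normal(FS * 10)     # 10 s of simulated DC + white noise

    print(f"raw:      mean={measurement.mean():.5f} V  std={measurement.std():.5f} V")

    averaged = measurement.reshape(-1, 10).mean(axis=1)         # average 10 samples -> 100 Hz effective rate
    print(f"averaged: mean={averaged.mean():.5f} V  std={averaged.std():.5f} V")
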
Message 5 of 5
(405 Views)