Troubles with Frequency Measurement and Time Stamp

Hey guys, I'm new to LabVIEW and have been working on a task for a while now. I'm trying to measure velocity with my encoder at a sample rate of 1 kHz. I came across some code that calculates velocity from angular position, but at that sample rate I couldn't get it to work. So I started exploring frequency acquisition and thought a buffered measurement might be a good option.
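Just to show the idea in text form (a rough Python sketch with made-up numbers, not my actual code): differencing the position at 1 kHz only gives about 31 encoder counts per sample at 1800 RPM, so a ±1 count error is already a few percent of the reading, which is probably why it didn't work for me.

```python
import numpy as np

dt = 1e-3                            # 1 kHz sample period, seconds
# Placeholder data: encoder count advancing ~31 counts per sample (~1800 RPM).
counts = np.arange(0, 31000, 31)
revs = counts / 1024.0               # counts -> revolutions (1024 PPR encoder)
rpm = np.diff(revs) / dt * 60.0      # finite-difference velocity, in RPM
```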

 

My motor runs at around 1800 RPM, and my encoder has a resolution of 1024 pulses per revolution, which gives an expected frequency of about (1800 / 60) × 1024 ≈ 30.7 kHz. I began using the Low Frequency with 1 Counter method: I generate a 1 kHz sample clock with a second counter on my DAQ and connect it to the gate of the counter that reads channel A of my encoder.
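For reference, in text form the setup is equivalent to something like this Python/nidaqmx sketch (the device name "Dev1" and the counter/terminal names are placeholders for my actual channels):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, FrequencyUnits

SAMPLE_RATE = 1000.0                 # 1 kHz sample clock

with nidaqmx.Task() as clk_task, nidaqmx.Task() as freq_task:
    # Counter 1: continuous 1 kHz pulse train used as the sample clock.
    clk_task.co_channels.add_co_pulse_chan_freq("Dev1/ctr1", freq=SAMPLE_RATE)
    clk_task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)

    # Counter 0: frequency of the encoder's A channel (~30.7 kHz expected).
    freq_task.ci_channels.add_ci_freq_chan(
        "Dev1/ctr0", min_val=1000.0, max_val=50000.0, units=FrequencyUnits.HZ)
    # Latch the measurement on every tick of counter 1's output.
    freq_task.timing.cfg_samp_clk_timing(
        SAMPLE_RATE, source="/Dev1/Ctr1InternalOutput",
        sample_mode=AcquisitionType.CONTINUOUS)

    clk_task.start()
    freq_task.start()
    freqs = freq_task.read(number_of_samples_per_channel=1000)  # 1 s of data
```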

 

However, for my application I need to generate a txt file with one column of frequency values and one column with the acquisition time of each sample. For the frequency measurement I use the "DAQmx Read" block with the option "Counter 1D DBL NChan NSamp," which doesn't provide time information. To address this, I used the "Build Waveform" block, set the start time to zero, and dt = 1 ms. It partly works: I get x-values for my graph, but I need to be certain that the spacing really is 1 ms, because I'll have to match these values with other measurements later.
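What I want the time column to be, in sketch form, is just the sample index times dt (with dt fixed by the hardware sample clock, so it really is 1 ms per sample):

```python
dt = 1.0 / 1000.0                    # 1 kHz sample clock -> dt = 1 ms
# 'freqs' is the array returned by the read in the sketch above.
timestamps = [k * dt for k in range(len(freqs))]
```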

 

To verify this, I tried checking the final time of the waveform against the number of samples, to see whether the rate really works out to 1 kHz. What I see is that the x-axis values just track the sample count: the final time equals the number of samples I set per loop iteration, and every time that count is reached, the "time" I generate resets to zero in the txt file.
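In sketch form (continuing the Python example above), I think what's missing is a running sample count carried across loop iterations (a shift register in LabVIEW), so the time column keeps increasing instead of restarting:

```python
dt = 1.0 / 1000.0
total_samples = 0
with open("freq_log.txt", "w") as log:          # hypothetical file name
    for _ in range(10):                         # e.g. ten 1-second reads
        freqs = freq_task.read(number_of_samples_per_channel=1000)
        for k, f in enumerate(freqs):
            log.write(f"{(total_samples + k) * dt:.3f}\t{f:.3f}\n")
        total_samples += len(freqs)             # carry the offset forward
```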

 

Can someone help me understand what I'm doing wrong or suggest an easier way to achieve what I'm attempting? I'm using the NI USB-6341 DAQ. Thanks for the help,

 

Flavia.

Message 1 of 3

I can't open your new 2023 code, but here's a link to an old post of mine that should help you solve your timestamp problem.  The code over there is for period measurement, but since period = 1/freq, it should be pretty trivial to adapt.

 

The basic idea is to do a cumulative sum of the intervals you're measuring.  There's a small subtlety involved in defining the time=0 point, but that only matters if you need precise correlation between these freq measurements and some other DAQ task that uses constant-rate sampling.
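In rough Python terms (placeholder data standing in for the measured periods):

```python
import numpy as np

# Placeholder: measured periods in seconds; real values come from the task.
intervals = np.random.uniform(3.2e-5, 3.3e-5, size=1000)  # ~30.7 kHz signal
freqs = 1.0 / intervals             # frequency = 1 / period
timestamps = np.cumsum(intervals)   # end time of each interval, t = 0 at start
```

The time = 0 subtlety is, e.g., whether the first sample should land at intervals[0] or at 0, i.e. whether you subtract intervals[0] from the cumulative sum.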

 

 

-Kevin P

Message 2 of 3

Hi Kevin,

 

Thanks for your response. So, if I understand correctly, the code you provided calculates the period of the signal the counter is reading, then performs a cumulative sum to build a continuous time stamp for each frequency measurement.

 

In this case, I should use the 1 kHz clock signal that I generated for sampling, instead of the signal from my encoder, is that right?

Message 3 of 3