LabVIEW

Two different sampling frequency for channels to improve the iteration speed

Solved!

Hi NI community,

 

I have written a LabVIEW program that reads from a Data Translation (DT) card using LV-Link. My code reads 7 channels every second at a sampling rate of 10 kHz and performs some processing on those signals before writing them to two separate .lvm files.

I need 3 of these channels at a 10 kHz sampling frequency, but for the other four a sampling rate of 1 Hz is enough. Currently, I am using an average function to reduce the 10 kS down to 1 S for each of these four channels.
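
(For clarity, here is a rough sketch of what that reduction amounts to, written as Python/NumPy pseudocode since the actual code is a LabVIEW diagram; the array contents below are simulated.)

```python
import numpy as np

# Simulated 1-second acquisition block: 7 channels x 10,000 samples (10 kHz)
block = np.random.randn(7, 10_000)

fast_channels = block[:3, :]               # 3 channels kept at the full 10 kHz rate
slow_channels = block[3:, :].mean(axis=1)  # 4 channels averaged down to one value per second (1 Hz)
```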

The problem I have encountered is that each iteration takes 7 seconds rather than 1 second. However, when I read only 3 of the channels, each iteration completes in 1 second.

So I would like to know whether it is possible to read 3 channels at a sampling rate of 10 kHz and the other 4 channels at a sampling rate of 1 Hz, while maintaining a 1-second iteration.

Are there any other ways to reduce the iteration time?

 

Thanks in advance,

Message 1 of 4

You are not saying which type of card you are using, but unless it is a very expensive high-end card, it is unlikely to support two independent sampling rates running in parallel.

 

That all said, sampling 7 channels at 10 kHz certainly doesn't seem like pushing the limits of modern computers, provided the data processing you perform is written decently. Since you only posted your VI in the 2017 version, I can't currently open it on this machine, so I can't say whether that is the problem. But the fact that you have put everything into one single VI suggests that the whole architecture is anything but streamlined; with a cleaner architecture and some decent optimization it is likely to be a lot faster. That is, unless the Data Translation API layer is worthless and uses up most of the processing performance transferring this fairly substantial amount of data into your application.
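
(As one example of what a more streamlined structure could look like: a minimal producer/consumer sketch, written as Python pseudocode since the real thing would be two LabVIEW loops connected by a queue. The acquisition and file-writing steps below are simulated placeholders, not LV-Link calls.)

```python
import queue
import random
import threading
import time

data_q = queue.Queue()

def acquire():
    """Producer loop: only reads the hardware (simulated here) and enqueues each block."""
    while True:
        block = [[random.random() for _ in range(10_000)] for _ in range(7)]
        data_q.put(block)
        time.sleep(1)  # stands in for the 1-second hardware read

def process():
    """Consumer loop: averaging, analysis, and logging happen here, off the read path."""
    while True:
        block = data_q.get()
        fast = block[:3]                                # 10 kHz channels, kept as-is
        slow = [sum(ch) / len(ch) for ch in block[3:]]  # remaining channels averaged to 1 Hz
        print(f"queued blocks: {data_q.qsize()}, slow values: {slow}")

threading.Thread(target=acquire, daemon=True).start()
threading.Thread(target=process, daemon=True).start()
time.sleep(5)  # let the demo run for a few seconds
```

Because the processing runs in its own loop, a slow iteration there only grows the queue for a moment instead of stalling the acquisition.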

Rolf Kalbermatter
My Blog
Message 2 of 4

Here are screenshots of the VI architecture I am using. My DAQ device is a DT9803 USB measurement module. I am not sure where I could make modifications to improve the code, but I would truly appreciate it if you could point me in the right direction. Could it be due to the fact that the DAQ device has a 100 kHz overall sampling rate and I am running this task at 70 kHz?

Message 3 of 4
Solution
Accepted by kamranesmaeili

Hello kamranesmaeili!

 

I looked at the VI you posted and I believe I found a solution for you. It has to do with DtOLTimingSetInput.vi from the LV-Link library, which you use to set the timing properties of the DAQ device. I found this on the corresponding help page:

"If your device has multiplexed analog input channels, you can determine the sampling frequency per channel as follows: Sample Frequency per Channel = Sample Freq (Hz)/Number of Channels."

The DT9803 manual shows that this device has only multiplexed analog inputs. So in your case you have to set the Sample Frequency (Hz) to 70 kHz to achieve 10 kHz sampling on each of your seven channels.
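
(As a quick sanity check of that arithmetic, plain Python rather than LV-Link code:)

```python
channels = 7
per_channel_hz = 10_000

# Sample Frequency per Channel = Sample Freq (Hz) / Number of Channels,
# so the aggregate rate to configure is the product:
aggregate_hz = per_channel_hz * channels
print(aggregate_hz)  # 70000 -> within the DT9803's 100 kHz aggregate maximum
```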


Ingo – LabVIEW 2013, 2014, 2015, 2016, 2017, 2018, NXG 2.0, 2.1, 3.0
CLADMSD
Message 4 of 4