LabVIEW


Sampling rate X6341

I am trying to collect samples at the highest rate possible in order to determine the compression signal value I need.

 

The input task (with 4 analog inputs) is set to 1k samples at a rate of 10 kS/s (for now). I am writing the data to a text file using a case structure and write-to-file blocks.

So, the program collects 1k samples at a rate of 10 kS/s and then repeats (since it is in a while loop), right? The time between samples written to file should therefore be 1/10k s, i.e. 100 microseconds, right? Should I bring the number of samples down to 2 (the minimum) so that the program executes faster?
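For reference, here is roughly the same configuration expressed in text form, using NI's nidaqmx Python API instead of LabVIEW (the device name "Dev1" is a placeholder); it also makes the expected 100 µs inter-sample spacing explicit:

```python
# Sketch of the described acquisition with NI's nidaqmx Python package.
# "Dev1" and the channel range are placeholders for illustration.
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 10_000           # 10 kS/s sample clock
SAMPS_PER_READ = 1_000  # 1k samples per loop iteration

with nidaqmx.Task() as task:
    # Four analog inputs, as in the original task
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")
    task.timing.cfg_samp_clk_timing(
        rate=RATE,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=SAMPS_PER_READ,
    )
    data = task.read(number_of_samples_per_channel=SAMPS_PER_READ)

# Hardware spacing between consecutive samples is set by the clock:
print(f"inter-sample interval = {1e6 / RATE:.0f} us")  # 100 us
```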

 

However, the timestamps of consecutive samples show a time difference on the order of 2000 microseconds instead of 100 microseconds. Why is that? Is it a limitation of the write capability of the module?

 

If I do not write to file, can I expect to get a better rate?

 

Using LabVIEW 2012
Message 1 of 8

Hi libindaniel2000,

 

I want to get a little more information before answering your questions, so I don't answer prematurely. Are you taking continuous samples, or a finite acquisition? Could you possibly post your code, or a VI snippet of what you are doing, so I can see it?

 

Which timestamp are you talking about? Are you saying the samples are only being written to the text file every 2 ms, or that your hardware is only taking samples every 2 ms? It is unlikely that you would be able to actually write to an external file every 100 µs, though 2 ms may be feasible. One hundred microseconds could, however, be the time between samples being acquired by the hardware.
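To make the distinction concrete, here is a hypothetical Python sketch (illustrative only, not your LabVIEW code): a timestamp taken in software marks when the loop iteration ran, which includes file I/O overhead, while the hardware spaces samples by its sample clock.

```python
# Software timestamps measure loop timing, not the ADC sample clock.
import time

RATE = 10_000  # 10 kS/s sample clock

prev = time.perf_counter()
for _ in range(5):
    # ... read a block from the DAQ and write it to file here ...
    time.sleep(0.002)  # stand-in for ~2 ms of loop and file-write overhead
    now = time.perf_counter()
    print(f"software loop interval: {(now - prev) * 1e3:.1f} ms")
    prev = now

# The hardware, meanwhile, spaces samples by the clock period:
print(f"hardware sample spacing: {1e6 / RATE:.0f} us")  # 100 us
```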

 

Also, which form factor are you using (USB, PCI, PCIe, etc.)?

Best Regards,

Thomas B.
National Instruments
Applications Engineer
Message 2 of 8

Thank you for the reply, Thomas B.!

I am taking finite samples. I will post the VI later today.

I am using the timestamp block and writing it to file along with the other inputs so that I can figure out the time difference between samples. So, the 2 ms time difference is the time between samples as they are written to file.

I am using USB.

So, there is no way to tell the time difference between samples being acquired by the hardware, since it is hardware dependent and would vary every time the program is run?

 

Using LabVIEW 2012
Message 3 of 8

Hi libindaniel2000,

 

When using a USB device you will not be able to explicitly time-stamp your data using the hardware. However, there are ways to get timing information through software. That being said, the precision and accuracy that can be achieved through software are limited.

 

If you acquire the data as a waveform, there is a dt inherently associated with each sample in the waveform that can be used to determine the time each sample was acquired, provided you know the start time of the acquisition. This community example shows a method of implementing this approach. However, depending on how accurate you really need the time information to be, that may not be too helpful.
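As a sketch of the idea (NumPy; t0 and the rate are placeholders for the values your acquisition actually reports):

```python
# Reconstructing per-sample times from the waveform's t0 and dt.
import numpy as np

RATE = 10_000                # sample clock, 10 kS/s
N_SAMPLES = 1_000
t0 = 0.0                     # acquisition start time, in seconds (placeholder)
dt = 1.0 / RATE              # waveform dt: 100 us between samples

timestamps = t0 + dt * np.arange(N_SAMPLES)  # time of each sample
print(timestamps[:3])        # [0.     0.0001 0.0002]
```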

 


Best Regards,

Thomas B.
National Instruments
Applications Engineer
Message 4 of 8

Here is the attached code. Sorry about the delay. Should I set the number of samples to read to 2000? That way, I would not have to worry about the loop execution rate, right?

Using LabVIEW 2012
Message 5 of 8

Hi libindaniel2000, 

 

Ultimately you will be limited by the execution rate of the loop rather than by the number of samples being read, even at 2000 samples per loop. Also, the timestamp associated with each sample reflects when the sample is written to file, not when it was taken. The method in the example I linked in my previous post will produce timestamps closer to the time the sample was actually taken. Have you looked through the example?
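If the loop rate becomes the bottleneck, one common workaround is a producer/consumer structure (in LabVIEW, two loops joined by a queue); here is a rough Python sketch of the idea, with placeholder data and file name:

```python
# Producer/consumer split: the producer stands in for DAQ reads while the
# consumer writes to disk, so file I/O no longer paces the acquisition.
import queue
import threading

buf = queue.Queue()

def producer():
    for _ in range(10):        # stand-in for ten hardware reads
        block = [0.0] * 1000   # one 1k-sample block per iteration
        buf.put(block)
    buf.put(None)              # sentinel: acquisition finished

def consumer():
    with open("data.txt", "w") as f:  # placeholder file name
        while (block := buf.get()) is not None:
            f.write(" ".join(map(str, block)) + "\n")

t = threading.Thread(target=producer)
t.start()
consumer()
t.join()
```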

 


Best Regards,

Thomas B.
National Instruments
Applications Engineer
Message 6 of 8

Thanks for the reply, Thomas B.! I figured that the loop execution rate was going to define the effective sampling rate, but I just wanted to make sure. I will implement the timestamp technique from the example and update the VI.

 

The purpose of this test is to figure out how many samples we need to collect to get steady data. So, I am collecting the data, finding the mean and median, and observing the decay, so that I can pick out the value I need for the compression signal block. Am I on the right track?
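For example, the convergence check I have in mind looks something like this (NumPy sketch with stand-in data in place of the acquired channel):

```python
# Watch the running mean and median settle as more samples are included,
# then pick the smallest sample count beyond which they stop drifting.
import numpy as np

rng = np.random.default_rng(0)
signal = rng.normal(loc=1.0, scale=0.1, size=2000)  # placeholder channel data

for n in (50, 200, 500, 1000, 2000):
    window = signal[:n]
    print(f"n={n:5d}  mean={window.mean():.4f}  median={np.median(window):.4f}")
```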

Using LabVIEW 2012
Message 7 of 8

Hi libindaniel2000,

 

From what I understand, it looks like you are on the right track. Good luck!

Best Regards,

Thomas B.
National Instruments
Applications Engineer
Message 8 of 8