LabVIEW


How much jitter should be expected with DAQ using a USB-6218 BNC device for AI and AO?

Hi all,

 

I am using a USB-6218 DAQ device to sample 6 AI channels (continuously at 2500 Hz) and write 1 AO channel (a maximum of 50 samples at 100 Hz)... The continuous AI signals are displayed on a PC monitor, and the operator manually triggers the AO (via a front panel button) based on visual changes they see in the AI signals...

 

I have developed the VI as a series of 4 producer/consumer while loops (1. AI DAQ loop; 2. AI plot/display loop; 3. AO write loop; and 4. write-data-to-disc loop) communicating via queues etc.

 

As I said, the device samples the AI data continuously at a rate of 2500 Hz, and the AI loop reads it in blocks of 250 samples (i.e. 10 reads per second, to ensure a smooth real-time display of the signals on the PC monitor)...
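
 

(For anyone who prefers a text sketch: the AI producer loop does roughly the following, expressed here with the nidaqmx Python API rather than a LabVIEW block diagram; the device name "Dev1" is just a placeholder, not my actual setup.)

# Rough text-form sketch of the AI producer loop; the real VI is LabVIEW.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0:5")   # 6 AI channels ("Dev1" is a placeholder)
    ai_task.timing.cfg_samp_clk_timing(
        rate=2500,                                           # 2500 Hz per channel, hardware-clocked
        sample_mode=AcquisitionType.CONTINUOUS)
    while True:
        # Read 250 samples per channel (~100 ms of data) per iteration and
        # hand the block to the display and disc-logging consumers via queues.
        block = ai_task.read(number_of_samples_per_channel=250)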

 

My question is, will the sampled AI and AO data contain any jitter when written to disc? I understand that plotting the data may be processor intensive and introduce some jitter in the visual display, which may not be detectable by the naked eye (which is fine)...

 

But will the time-stamped AI and AO data that is written to disc accurately align with the 2500 Hz and 100 Hz sample rates?

 

My understanding is that the data are sampled by the USB-6218 DAQ device, which buffers the data as needed depending on the number of data points to read, DAQ memory, PC processor speed, RAM, etc... So the only issue should be whether or not the PC system can cope with the resource demands required for the real-time signal plotting on the computer monitor...

 

Could someone please clarify this for me?

 

Many thanks...   

 

     

Message 1 of 10

Hi jcannon,

 

The sampled data will have some jitter, as described in the specs of your DAQ device.

Once the data is sampled, it is "fixed" and will not change anymore…

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 10

Just to expand a little bit on GerdW's answer...

 

1. The time between samples for your AI data is controlled by a clock on your 6218 device.  Yes, there will be some jitter in the timing, but it will be measured in nanoseconds and is negligible for 99%+ of applications.

 

2. AO sample timing will also be controlled by a clock on the device, so the same considerations apply for jitter.

 

3. However, *alignment* of data might be a trickier question to answer, depending on exactly what you're looking for.  The buffers involved in DAQ tasks necessarily introduce latency.  When you read the AI data in chunks, the samples you receive were taken at different points of time in the past.  When you write AO data to your task, samples need to pass through 3 buffers (task buffer, USB transfer buffer, device FIFO) before they become real-world signals at different points of time in the future.
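
 

(To put rough numbers on it for your case: at 2500 Hz read in 250-sample blocks, each AI read returns 100 ms worth of data, so the oldest sample in a block is already about 100 ms old, plus USB transfer time, by the time your loop sees it.  Likewise, AO samples you write now only become real-world signals after they've worked their way through those buffers.)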

 

AI data itself will have negligible jitter.  AO signals themselves will have negligible jitter.  But the *latency* between them is a much more difficult thing to predict or control with precision.

 

 

-Kevin P

ALERT! LabVIEW's subscription-only policy coming to an end (finally!). Permanent license pricing remains WIP. Tread carefully.
Message 3 of 10

Thanks, Kevin... I understand what you are saying...

 

In terms of aligning the signals, the AO signal leaving channel AO 0 is split in two with a BNC T-connector: one branch goes to trigger an external device and the other feeds back into one of the AI channels...

 

So the timing/delivery of the AO signal is itself sampled as one of the AI signals... I therefore assume that the alignment of all AI and AO signals should be near perfect (less any jitter associated with the DAQ clocks)...?
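
 

(And if I ever need a number for the AO-to-AI latency, I can count how many AI samples elapse between commanding the AO and seeing the looped-back edge; at 2500 Hz each sample is 0.4 ms.)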

 

Would this be correct?

 

  

Message 4 of 10

Hi GerdW,

 

Thanks for your reply...

 

I just replied to Kevin's response... Would you mind having a look and letting me know your thoughts?

 

Many thanks...

Message 5 of 10

Kevin, re: your caution...

 

The subscription model is a killer... I am a university professor and use LabVIEW for my academic research... Currently, I am still using 2015 SP1, which continues to work on Win11... Otherwise, it would be AU$2k per year for me to continue using it...

 

I am seriously considering moving to MATLAB, as all NI DAQ devices can still be used and my university already has a MATLAB licence...

 

Such a shame....

Message 6 of 10

@jcannon wrote:

I am seriously considering moving to MATLAB, as all NI DAQ devices can still be used and my university already has a MATLAB licence...

 


Just a note, "can still be used" may not be sufficient for all applications; MATLAB's support for NI DAQ is quite limited as a lot of the DAQmx driver features are abstracted, and you may not be able to fully utilize the DAQ you paid for.

Santhosh
Soliton Technologies

Message 7 of 10

Thanks, I'll keep that in mind and make sure to investigate this before making any decision...

 

We only use USB NI DAQ devices and I understand they should be fine...

Message 8 of 10

@jcannon wrote:

So the timing/delivery of the AO signal is itself sampled as one of the AI signals... I therefore assume that the alignment of all AI and AO signals should be near perfect (less any jitter associated with the DAQ clocks)...?

Yes, with a possible small asterisk.  The 6218 is a multiplexing device where a single physical A/D converter is time-shared amongst all the AI channels being sampled.  Deeper down, there's a second clock, known in DAQmx as the "convert clock", that controls the channel-to-channel time interval within a sample.

 

By default, DAQmx will choose a rate that spreads out the channel conversions as far apart from one another as possible within one sample interval (and across adjacent ones).  This is often a good default as it gives max time for charge to bleed off from one channel's conversion and not influence the next channel.  

 

It's possible to either set or query the AI convert clock rate with a DAQmx Timing property node when there's a need to control or know it.  Many apps and systems don't really require such detailed considerations, but some do.
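
 

(Outside of LabVIEW, I believe the same thing is exposed as a timing property in the nidaqmx Python API; the property name in the sketch below is my assumption of the equivalent, so verify it against the driver docs before relying on it.)

# Sketch: query (or set) the AI convert clock rate via the nidaqmx Python API.
# The ai_conv_rate property name is assumed to be the Python equivalent of the
# LabVIEW DAQmx Timing property node -- double-check before relying on it.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as ai_task:
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0:5")    # "Dev1" is a placeholder
    ai_task.timing.cfg_samp_clk_timing(2500, sample_mode=AcquisitionType.CONTINUOUS)
    print("AI convert clock rate:", ai_task.timing.ai_conv_rate)   # query the default
    # ai_task.timing.ai_conv_rate = 20000.0                        # ...or set it explicitly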

 

 

-Kevin P

Message 9 of 10

Thanks, Kevin... The default settings should be OK for our applications... Thanks again for your explanation, that is very helpful...

Message 10 of 10