DAQmx latency/synchronization, AO to AI

Sorry - remember this is some made-up code out of a much larger application. We're automatically setting that output voltage and looking for the transient response within the 100 ms window. If the edge is at the beginning, everything is OK. But when DAQmx mysteriously pushes the edge nearly halfway through the window, we miss the tail end of the transient.

 

The "fix it and be done" answer is to pick an Fs, measure the latency, and hardcode fudge a time offset. This is what I'll probably do. But I'm really bothered not understanding why the edge (phase) would move like that just by changing the sample rate.

Message 11 of 13

@OneOfTheDans wrote:

Sorry - remember this is some made-up code out of a much larger application. We're automatically setting that output voltage and looking for the transient response within the 100 ms window. If the edge is at the beginning, everything is OK. But when DAQmx mysteriously pushes the edge nearly halfway through the window, we miss the tail end of the transient.

 

The "fix it and be done" answer is to pick an Fs, measure the latency, and hardcode fudge a time offset. This is what I'll probably do. But I'm really bothered not understanding why the edge (phase) would move like that just by changing the sample rate.


Set the AO before sampling; the AI task should use its buffer, so you don't miss anything.
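
For what it's worth, a minimal sketch of that buffered approach in Python with the nidaqmx package (channel names are placeholders): the AI task runs continuously, so a step written between reads is already sitting in the DAQmx buffer when the next read comes around.

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    FS = 10_000                    # sample rate (Hz)
    CHUNK = FS // 10               # 100 ms of samples per read

    with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        ai.timing.cfg_samp_clk_timing(FS, sample_mode=AcquisitionType.CONTINUOUS)
        ai.start()                 # from here on, every sample goes into the buffer

        ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
        ao.write(5.0)              # step the output between reads
        data = ai.read(number_of_samples_per_channel=CHUNK)
        # the edge is in `data` even if this read happens late;
        # the buffer held the samples until we asked for them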

G# - Award-winning reference-based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified LabVIEW Developer
Message 12 of 13

@OneOfTheDans wrote:

Sorry - remember this is some made-up code out of a much larger application. We're automatically setting that output voltage and looking for the transient response within the 100 ms window. If the edge is at the beginning, everything is OK. But when DAQmx mysteriously pushes the edge nearly halfway through the window, we miss the tail end of the transient.

 

The "fix it and be done" answer is to pick an Fs, measure the latency, and hardcode fudge a time offset. This is what I'll probably do. But I'm really bothered not understanding why the edge (phase) would move like that just by changing the sample rate.


Understood. I am unsure how you are testing the latency in the real program. I have a general-purpose DAQ program, and when it runs networked DAQ devices there is latency with respect to the display, but those are just artifacts.

 

Attached is a way to test your latency; sorry for the sloppy code. Here a new file is created every time you change the voltage. Since the voltage change occurs between reads, it should land within the first 100 ms of the file, as shown below. It won't be the same every time, because it depends on a front-panel event, but it should always be below 100 ms. Can you show us what you are seeing?

 

[Attached screenshot: mcduff_0-1731957538977.png, showing the voltage change within the first 100 ms of the file]
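
For readers without the attachment, a rough Python/nidaqmx equivalent of that test; the channel names and .npy files are stand-ins, and a scripted list of setpoints replaces the front-panel event:

    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    FS = 10_000                    # sample rate (Hz)
    CHUNK = FS // 10               # 100 ms per read

    setpoints = [0.0, 2.0, 5.0, 1.0]          # stand-in for the front-panel changes

    with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        ai.timing.cfg_samp_clk_timing(FS, sample_mode=AcquisitionType.CONTINUOUS)
        ai.start()
        ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")

        for volts in setpoints:
            ao.write(volts)        # the change lands between two reads
            chunk = np.asarray(ai.read(number_of_samples_per_channel=CHUNK))
            np.save(f"record_{volts:+.1f}V.npy", chunk)    # "new file" per change
            edge = int(np.argmax(np.abs(np.diff(chunk)))) + 1
            print(f"{volts:+.1f} V: edge at {1000 * edge / FS:.1f} ms into the record")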

 

Message 13 of 13