11-14-2024 03:44 PM
I'm having trouble tracking down a latency between two simple DAQmx tasks. Please let me know if you have any insights!
There is a simple Continuous Samples voltage read task. A loop pulls samples off the task buffer every 100 ms (the loop is paced by DAQmx Read). After each chunk, it writes a single On Demand voltage to a second, output task. When I change the voltage on the GUI, it gets written out after the current read completes, and the new voltage is measured during the next 100 ms read window.
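In case a text version helps, the structure of the loop is roughly this (sketched here with the Python nidaqmx API rather than my actual LabVIEW code; the device names, channels, and rates are just placeholders):

import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE = 50_000                  # also tried 2 kSa/s; the latency tracks this
SAMPS_PER_READ = SAMPLE_RATE // 10    # 100 ms worth of samples per loop

with nidaqmx.Task() as ai_task, nidaqmx.Task() as ao_task:
    # Continuous Samples voltage read task
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai_task.timing.cfg_samp_clk_timing(
        SAMPLE_RATE, sample_mode=AcquisitionType.CONTINUOUS)

    # Output task is left unclocked, so each write() is an On Demand update
    ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")

    ai_task.start()
    setpoint = 0.0
    for i in range(100):
        # DAQmx Read blocks until 100 ms of samples are available,
        # which is what paces the loop
        chunk = ai_task.read(number_of_samples_per_channel=SAMPS_PER_READ)

        # Stand-in for the user stepping the voltage on the GUI
        if i == 50:
            setpoint = 1.0

        # After each chunk, write the current setpoint to the output task
        ao_task.write(setpoint)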
When I step the voltage, it shows up in the 100 ms window with some latency. I expect this, because the output is On Demand and the DAQmx buffer is always a little behind. Here's where I'm stuck: the latency is extremely dependent on the read sample rate! 50 kSa/s might cause 5 ms of latency, but 2 kSa/s can push the edge out to 30 ms. Why would the read sample rate affect where the edge lands? If anything, I would expect faster sample rates to delay the edge more, because more samples take longer to transfer out of the buffer; yet the opposite is true!
Any good ideas why this is happening? I'm also open to different timing/synchronization schemes, but at this point the drifting latency really has its hooks in me.
Thanks,
Dan
11-14-2024 11:24 PM
Please post your test code, preferably saved back to LV 2020 or earlier. That'll probably make things clearer quicker than going back and forth with words.
-Kevin P
11-15-2024 09:14 AM
Yep, fair enough. I've been staring at it long enough that it's simple in my mind, but I know a picture's worth a thousand words!
This is a little mockup of the code. Of course, the real code is intertwined with a huge application.