Can I rely on DAQmx timing?

I use CompactDAQ to acquire data from multiple modules. I will be running ~100-hour measurements, and the IT guy wants each sample paired with a precise timestamp value. So before I save data to TDMS, I take the t_0 and dt of the waveform acquired *at that moment* and create a time array with a for loop. This way, I save a timestamp value with each signal value. It is the most transparent way for me, and I kind of believe it is a safe solution.

 

But how does DAQmx handle the hardware timing discrepancy over a long time? Is it guaranteed that if I took the first t_0 (the one saved to the wf_start_time TDMS property) and calculated the timestamps for all samples over 100 hours with dt, I would arrive at the same t_end from multiple sources? It isn't, right? It would be great if I could just calculate the timestamps for all the data after finishing the measurement, from just t_0 and dt.
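For concreteness, the per-chunk timestamp array described above can be sketched in text form (a Python sketch, since LabVIEW block diagrams don't paste into a forum post; the function name is illustrative):

```python
from datetime import datetime, timedelta

def build_timestamps(t0, dt, n_samples):
    """Mirror the for-loop approach: pair sample i with t0 + i*dt."""
    return [t0 + timedelta(seconds=i * dt) for i in range(n_samples)]

# One 1 kHz waveform chunk of 1000 samples
t0 = datetime(2024, 1, 1, 12, 0, 0)
stamps = build_timestamps(t0, dt=0.001, n_samples=1000)
# stamps[-1] is t0 + 999 ms
```

The question then becomes whether extrapolating this formula over the full 100 hours stays honest.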

Message 1 of 14

Hi Vit,

 


@VitSlaby wrote:

I use compactDAQ to acquire data from multiple modules. I will be running ~100 hours measurements

 

But how does DAQmx solve the HW timing discrepancy over long time?


When you read the specs of your cDAQ (like this), there is a timing-accuracy spec given…

Your cDAQ should usually stay within that spec!

 

What is your definition of "precisely"?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 14

If I'm reading it right, the 50 ppm timing accuracy for the cDAQ means up to 18 seconds of error over 100 hours of measurement. Just to make sure: is the mentioned timer responsible for the DI, DO, AO, and AI measurement timing? (The Sound and Vibration modules use their own timer, right?)
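As a sanity check on that arithmetic (plain Python, using the 50 ppm spec figure):

```python
# Worst-case accumulated clock error at a 50 ppm timing-accuracy spec
ppm = 50e-6
run_seconds = 100 * 3600            # a 100-hour measurement is 360,000 s
worst_case_drift = ppm * run_seconds
# 50e-6 * 360000 s = 18 s of possible drift over the full run
```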

 

Then my question changes a little bit: from your experience, what is the best way to overcome this without needing to calculate timestamps on each DAQmx Read? My idea is to save t_0, t_end, and the number of samples, and then linearly interpolate in between. From your experience, could this be close to the truth, or does the discrepancy change a lot over time?

Message 3 of 14

[Edit: started this in response to your initial question.  By the time I posted, you had followed up with the same solution I arrived at.  So I'll endorse your method and leave the rest of the post below to give future readers some more context for the thought process behind it.]

-------------------------------------------------------------

Short answer: your intuition is right, there IS a lurking issue with time disagreement between DAQmx and your PC's time-of-day clock.

 

There's no simple perfect fix, but you can do some very, very good approximations.  However, there are some things to know before you navigate toward your preferred solution.

 

1. How DAQmx deals with t0, dt: 

    The initial t0 is established one time only when you make your first DAQmx Read.  Based on the known sample rate, the # of samples acquired so far, and a one-time query to the real time-of-day clock of your PC, a best estimate for an initial t0 is established.  After that, all subsequent t0's are calculated from a combo of the initial t0 and the known dt.
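Point 1 can be modeled in a few lines (a Python toy model of the behavior described above, NOT the actual DAQmx internals; the class and names are made up for illustration):

```python
from datetime import datetime, timedelta

class ToyDaqmxTask:
    """Toy model: t0 is taken from the PC clock once, at the first read;
    every later chunk's t0 is computed as t0 + samples_read * dt."""
    def __init__(self, sample_rate):
        self.dt = 1.0 / sample_rate
        self.t0 = None
        self.samples_read = 0

    def read(self, n_samples, wall_clock_now):
        if self.t0 is None:
            self.t0 = wall_clock_now   # one-time time-of-day query
        chunk_t0 = self.t0 + timedelta(seconds=self.samples_read * self.dt)
        self.samples_read += n_samples
        return chunk_t0

task = ToyDaqmxTask(sample_rate=1000.0)
first = task.read(1000, datetime(2024, 1, 1, 0, 0, 0))
# The wall clock passed on later reads is ignored -- t0 is already set:
second = task.read(1000, datetime(2024, 1, 1, 0, 0, 5))
# second chunk t0 = first t0 + 1.0 s, no matter what the PC clock says
```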

 

2. Accuracy of DAQmx's t0, dt:

    t0 will normally be quite good.  No specs or guarantees here, but I suspect we're talking the sub-millisecond realm.  dt is also quite good, with many common devices spec'ed to 50 ppm timing accuracy.  At the spec limit, this is 3 msec per minute, which leads to about 4.3 seconds per day.  Any actual devices I've checked out have appeared to be at less than half of this.

    Generally speaking, dt will be *very* consistent.  If slightly inaccurate, it will generally be a consistent inaccuracy.  There are small temperature dependencies, but long-running tests tend to be near-steady-state for temp.

 

3. Accuracy of PC's real time-of-day clock:

    Absolute time will be quite good.  The clock itself is subject to the same kinds of inaccuracies as DAQ devices under DAQmx, but modern-era OS'es have things like NTP to prevent long-term timing drift (assuming a network connection).  The local clock is regularly kept in alignment with a trusted network time source.

    The "dt" of this clock is less consistent as it regularly gets tiny corrections to stay in sync with the time server it finds out on the network.  The magnitude of this variation will be quite small but it's prone to change (slightly) pretty often.

    Note: I was around in the old days of DOS (and early Windows too I think) where network connections were not ubiquitous, and it was commonplace for PC clocks to drift noticeably over the course of weeks & months.  Back then, we all regularly saw the issues of cumulative inaccuracy and drift.  Nowadays, it's largely hidden away and many aren't even aware.  So good job anticipating this!

 

4. Hardware vs software time

    The PC's real time-of-day clock must be queried by a software call.  DAQmx's sample clock is a hardware pulse train derived from a fixed-frequency crystal oscillator. 

    Each individual query of the real time-of-day clock carries a little jitter in both t and dt as a consequence of the software query.  t itself is prevented from cumulative error over the long run via NTP.

    Under DAQmx, dt has very little jitter, but its small inaccuracy *does* accumulate over the long run.  And this shows up in each waveform chunk's t0 value, which is *calculated*.
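You can see the software-query jitter for yourself by comparing repeated time-of-day reads against the OS monotonic clock (a Python sketch; actual magnitudes depend on OS and load, so no specific numbers claimed):

```python
import time

# Offset between the time-of-day clock and the monotonic clock.
# For a hardware pulse train this offset would be constant; in software
# it wanders with per-call jitter and any NTP step/slew corrections.
offsets = [time.time() - time.monotonic() for _ in range(10_000)]

jitter_span = max(offsets) - min(offsets)
# jitter_span is the observed wander across 10,000 queries
```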

 

5. Suggested practical solution:

    During your 100 hours, treat the waveform t0 and dt as though perfect.  Don't mess with them or adjust them.  Just store your data and act like you believe them.

    However, *also* be sure to carefully query the real time-of-day clock at both the beginning and end of the acquisition.  Do this *very* carefully, thinking about dataflow, execution sequence, and possible "off by one" errors when assigning that specific time to a specific sample.  Then you'll have a 2nd measure of the delta time for the entire test and a known number of sample intervals between them.  Now you can calculate an *actual* average acquisition dt and also store *this*.

    If you were careful enough, this measure of dt is likely just a little bit more accurate than what you got from DAQmx over the course of a 100 hour test.  (The method wouldn't be such a great idea for shorter tests where the non-cumulative jitter of software timing could be larger than the cumulative DAQmx clock inaccuracy.)

    During subsequent post-processing, use your better estimate of dt in place of the one from DAQmx.
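Spelled out, that post-processing step is just: bracket the run with two careful wall-clock reads and divide by the number of sample intervals (a Python sketch of the arithmetic; the careful part is the dataflow around those two reads, not this math):

```python
from datetime import datetime, timedelta

def corrected_dt(wall_start, wall_end, n_samples):
    """Actual average sample interval over the whole run.
    n_samples samples span (n_samples - 1) intervals."""
    elapsed = (wall_end - wall_start).total_seconds()
    return elapsed / (n_samples - 1)

# 100 h at a nominal 1 kHz whose clock ran 50 ppm slow,
# so the run actually finished 18 s late:
start = datetime(2024, 1, 1, 0, 0, 0)
end = start + timedelta(hours=100, seconds=18)
dt = corrected_dt(start, end, n_samples=360_000_001)
# dt comes out ~0.00100005 s instead of the nominal 0.001 s
```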

 

Final thought:  don't read this as an indictment of DAQmx or the data acquisition clocks.  They're inherently pretty much on par with the PC clock for accuracy.  It's just that the PC gets extra help via regular adjustments to align with a network time server.  That makes sense for an overall PC in a way that it *doesn't* make sense for a data acq device.  A lot of data processing algorithms assume and depend on consistent sample intervals.  So a data acq device needs its clock to operate at a regular and consistent rate and *not* try to adjust it mid-stream.

    Hey, it's the real world, where "good enough" is often a more useful target to aim for than "exactly correct."

 

 

-Kevin P 

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 4 of 14

Kevin has provided an excellent write-up.

 

If you need the best time accuracy, you could go with a PXI system with an OCXO clock source (ppb-level accuracy) and even GPS time sync. In reality, you're always limited by budget; you can only go so far.

Santhosh
Soliton Technologies

Message 5 of 14

Are you using the 9189 or 9185 cDAQ chassis by any chance? Those chassis should have a steerable oscillator that can be disciplined by their 1588/802.1AS capable NIC which might give you the behavior you want.

 

As long as the computer and cDAQ are both slaved to the same grandmaster (or if one of them is the GM) then the DAQmx data should be correctly correlated to 1588 time without any drift (and with 1us accuracy).

Matt J | National Instruments | CLA
Message 6 of 14

Both santhosh and Jacobson offered up improvements on the scheme the OP and I settled on.  If you have the right cDAQ chassis, I'd focus especially on the suggestion from Jacobson.

 

I have some awareness of the timing standard he referenced, but no personal experience setting it up and managing its configuration.  I know a colleague who went through struggles dealing with a third-party Linux box that was supposed to act as the 1588/802.1AS (aka "PTP", at least I *think* they're synonymous) timing master.  My rumor-level understanding is that eventually the problems could largely be ascribed to a faulty PTP implementation on the Linux box, and that the NI and Windows side was not a significant headache.

 

So, pleading ignorance, I'll leave it to others to offer more specific thoughts about using that kind of timing standard and the hardware to support it.  (Note: this probably includes adding a new NIC to your PC -- it's pretty unlikely that the built-in one would support this standard.)

 

Meanwhile, let me also be a little voice of experience as a problem-solving engineer.  Everybody wants everything until they find out they have to foot the bill.  Sometimes the bill is $, sometimes it's time, sometimes it's quality/reliability, often it's a combo.

 

A pretty good solution that's well within your understanding, and that you can deploy immediately, has a certain value compared to the cost of new hardware, delivery and development lead time, and learning-curve risk.  There are absolutely cases where extreme precision is needed and worth it.  But I've also absolutely seen times when "customers" backed off on their requirements by two orders of magnitude once they were educated about the associated cost & risk.

 

Have a discussion with your "IT guy" about the timestamp accuracy requirements.  What are the specs, what is the rationale, and how much $, time, or risk is he willing to offer up to achieve them?

 

 

-Kevin P

 

P.S.  It very well may be the case that the 1588 timing standard approach will be worthwhile to pursue.  Not trying to talk you *out* of it.  Just can't vouch for it personally, and also catching a whiff of wishes that are mislabeled as needs.

Message 7 of 14

@Kevin_Price wrote:

2. Accuracy of DAQmx's t0, dt:

    t0 will normally be quite good.  No specs or guarantees here, but I suspect we're talking sub-millisec realm. 


On a PXI-1082 chassis that I tested, I would disagree. I tested a PXIe-6366 card and a PXIe-4499 card. Both cards had different sampling rates, were phase-locked to the 10 MHz clock in the backplane, and were digitally triggered. Because they had different sampling rates, there were two tasks. (Basically a modified VI from the example folder for synchronization.) Looking at the reported t0 for each waveform, I found a 1-11 ms difference in reported t0 values between devices, with a cluster between 2-4 ms.

 

For a Windows system I would say t0 is probably within milliseconds; for an RT system, maybe better, but I haven't tested it.

Message 8 of 14

To (mainly) mcduff:

 

1. Thanks for the correction.  Data's better than guesses!

2. I wonder whether the inherent delay in the 4499's delta-sigma converter is playing a role here?  Maybe DAQmx isn't automatically compensating for that?  

3. I also wonder whether there's also a contribution due to differences in default trigger delay?

4. Even if so, it makes *your* point that t0 is less trustworthy than I had figured.

5. Intrigued now, I'll try to test tomorrow too.  I can get at a PXI system with an embedded Windows PC and 2 identical 6361 cards.  I want to try syncing with just a shared sample clock, no triggers, no delta-sigma converters, and see what I get.

 

 

-Kevin P

Message 9 of 14

@Kevin_Price wrote:

2. I wonder whether the inherent delay in the 4499's delta-sigma converter is playing a role here?  Maybe DAQmx isn't automatically compensating for that?

I used the property "remove filter delay" for the DSA card, both True and False; it made no difference in the t0 differences. The two cards were definitely synced together: if I plotted the signals versus relative time, with both t0's set to 0, the waveforms lined up nicely.

 

If you have multiple tasks that are synced, I don't think the t0's will ever line up exactly. If you need absolute time, you have a few options:

  1. Trigger the system at a known time (for example with a GPS), then change t0 after the acquisition. If you have a spare channel, you can digitize a PPS from a GPS to see if your sample clock is drifting.
  2. Switch to NI-Scope. NI-Scope cards have hardware time stamping, but the API is not as nice as DAQmx.
  3. Someone like yourself (@Kevin_Price) who is a counter genius can probably make a counter synced to a GPS conditioned 10MHz clock and use that clock for sampling.
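The PPS cross-check in option 1 is simple to post-process: find the PPS rising edges in the digitized spare channel and compare samples-per-second against the nominal rate (a Python sketch; the function names are made up for illustration):

```python
def pps_edge_indices(pps, threshold=2.5):
    """Sample indices where a digitized GPS PPS crosses the threshold upward."""
    return [i for i in range(1, len(pps))
            if pps[i] > threshold and pps[i - 1] <= threshold]

def clock_drift_ppm(edges, nominal_rate):
    """Mean samples between PPS edges vs. the nominal sample rate, in ppm."""
    intervals = [b - a for a, b in zip(edges, edges[1:])]
    mean_samples_per_second = sum(intervals) / len(intervals)
    return (mean_samples_per_second / nominal_rate - 1.0) * 1e6

# A 10 kHz task whose clock runs 100 ppm fast sees ~10001 samples
# between consecutive once-per-second PPS edges:
edges = [0, 10_001, 20_002, 30_003]
drift = clock_drift_ppm(edges, nominal_rate=10_000.0)   # ~100 ppm
```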
Message 10 of 14