03-18-2013 04:06 PM
Hello
I am trying to capture a time stamp of some sort whenever the DAQ receives a pulse. I have two methods (attached) which are giving slightly different times: one uses the "Get Time Stamp" function, while the other uses the "Tick Clock". The format of the time stamp does not matter too much, but the accuracy does, so I am wondering which one is more accurate. Also, this VI will most likely be utilized on a real-time system (LabVIEW RT); if one VI or the other cannot be used in real time, then that would take precedence over the accuracy.
Thank you
03-18-2013 04:30 PM
Timing accuracy is not very good with either method. The Timestamp uses the computer's time-of-day clock. If you were using computers before time servers became available, you may recall that computer clocks drifted by seconds per day or more. Over the long term (weeks to years) the timestamp will be, on average, more accurate because of the continual resetting by the time servers. However, you may see jumps due to those resets.
The Tick Count is derived from an internal oscillator in the computer. Because the computer is more concerned with synchronizing logic across the CPU (and peripheral chips) than with timing accuracy, these oscillators are neither highly stable nor accurate.
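To illustrate the distinction outside of LabVIEW: most languages expose the same two kinds of clocks. Here is a small Python sketch (not LabVIEW code, just an analogy) contrasting a wall-clock time source, which a time server can step forward or backward, with a monotonic tick-based source, which never jumps but relies on the drifting internal oscillator described above.

```python
import time

# Wall-clock time (like the LabVIEW Timestamp): tied to the time-of-day
# clock, so NTP corrections can make it jump between two readings.
wall_start = time.time()

# Monotonic tick-based time (like the Tick Count): guaranteed never to
# jump backward, but its accuracy depends on the hardware oscillator.
tick_start = time.perf_counter()

time.sleep(0.1)

wall_elapsed = time.time() - wall_start
tick_elapsed = time.perf_counter() - tick_start

print(f"wall clock elapsed: {wall_elapsed:.4f} s")
print(f"tick clock elapsed: {tick_elapsed:.4f} s")
```

Over a short interval the two usually agree closely; the differences show up over long runs (oscillator drift) or at the moment a time-server correction lands (wall-clock jump).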
The oscillator in your DAQ device is almost certainly more stable and probably more accurate than either of the internal clocks available.
Get the signal from the DAQ as a waveform and use the t0 time from that. Short of setting up your own atomic clock or GPS-derived clock, that is probably the best you can do. It is also most closely linked to the acquisition itself, as opposed to the time the data is read from the DAQ device.
Lynn
03-18-2013 05:02 PM
Hello Lynn
Thank you for your timely and insightful response. I will try capturing the time the way you mentioned.
Thanks
Stephen