05-02-2009 03:37 PM
Reading a timestamp from a PXI-6682 in LabVIEW (see attached .vi) has a latency of
variable duration, on average close to 60 ms. At clock frequencies above, say, 40 Hz,
this latency causes the timestamp buffer to overflow. How could one reduce that
latency and avoid the timestamp buffer overflow?
If one sets up a clock event under the PXI-6682 device in MAX at a frequency of 100 Hz
and timestamps it from MAX, everything works nicely and there are no buffer overflows.
05-04-2009 03:31 PM
Hey Tihomir,
Looking at your program, it seems like you're trying to timestamp every pulse of a clock. Is there any reason you need to timestamp every single clock pulse? You should be able to call the timestamp function once and then use the frequency value you set to calculate the remaining timestamp values from there. The latency probably occurs because you're software-timing your timestamp acquisition with your while loop. I believe that Measurement and Automation Explorer doesn't timestamp every clock pulse, and instead uses the clock value to calculate the timestamps.
05-04-2009 04:01 PM
Dear Justin,
Many thanks for looking into this matter. Is there a code example you could
direct me to, so I could see how to do what you propose? I am not aware of a
way to associate a counter with the clock so that one could read it
and recalculate the timestamps.
I appreciate your help,
Tihomir
05-05-2009 02:22 PM
Hey Tihomir,
What exactly are you trying to do with your application? It might be easier for me to make suggestions if I knew what you were trying to do. Do you simply want to timestamp a clock?
There's not a direct code example I can link you to that does what I am talking about. To clarify a bit more, let's say you want to generate timestamps for a 50 Hz signal. The oscillator on the 6682 has an accuracy of 1 part per million. This means that if you're generating a 50 Hz signal from that reference, your signal will actually be 50 ± 0.00005 Hz. This is a relatively accurate signal. If you were to take a timestamp at the beginning of your clock generation, you would be able to mathematically calculate the time for each of the pulses. Let's say the timestamp taken at the first clock pulse occurs at 1 second. The second clock pulse would occur at 1 + (1/50 Hz) seconds, the third clock pulse would occur at 1 + 2*(1/50 Hz), and so on. There's not really a need to timestamp every single clock pulse, as you can infer what the timestamps are going to be.
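As a rough illustration of that arithmetic (plain Python rather than LabVIEW, with the 50 Hz rate and 1-second starting timestamp taken from the example above):

# Derive per-pulse timestamps from one measured timestamp and the known
# clock frequency, instead of reading a timestamp for every pulse.
def pulse_timestamps(first_timestamp_s, frequency_hz, num_pulses):
    period_s = 1.0 / frequency_hz
    return [first_timestamp_s + n * period_s for n in range(num_pulses)]

# First pulse timestamped at t = 1.0 s, 50 Hz clock, first five pulses:
# prints [1.0, 1.02, 1.04, 1.06, 1.08]
print(pulse_timestamps(1.0, 50.0, 5))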
If you're just trying to take multiple timestamps off of one signal, take a look at Time Stamp Clock-Cont.VI from the NI Example Finder. To find this example, open LabVIEW and navigate to Help->Find Examples...
Then, make sure the Browse tab is selected, and navigate to Hardware Input and Output -> Timing and Synchronization -> Time-Based -> Time Stamp Clock-Cont.VI
05-05-2009 03:22 PM
Hey Justin,
I am working on a data acquisition application that collects
data from 20 serial ports, an inertial navigation device connected over
Ethernet, and a large number of analog lines. To synchronize these
data with data collected by other remote (non-NI) systems, I was
hoping to use a GPS-referenced timestamp for each sample I
collect. The signals vary slowly, so a time resolution of 10 ms
(a 100 Hz update rate) suffices.
I am not aware of any documented counters on the PXI-6682.
Again, if I create a clock event under the PXI-6682 device in MAX
with a frequency of 1 kHz, the timestamping in MAX works without
error messages or buffer overflows.
Many thanks,
Tihomir
05-05-2009 04:06 PM
By reading only one point at a time, your application will be prone to overflows. I would suggest reading multiple timestamps at a time, say 10 or 100, depending on your clock frequency. You can still feed the array of timestamps into a For Loop if you need to process them one at a time. You can also increase the timestamp buffer size and monitor the number of timestamps in the buffer to help prevent overflows. Some customers will wire the "Available Time Stamps" property into the "number of timestamps" input of the niSync Read Multiple VI.
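In text form, the pattern looks roughly like the sketch below (Python standing in for the LabVIEW diagram; the deque and the read_multiple function are just placeholders for the device's timestamp buffer and the niSync read VI, not a real API):

import collections

BLOCK_SIZE = 100  # timestamps read per call; tune to your clock frequency

# Stand-in for the device's on-board timestamp buffer (pre-filled with
# fake 100 Hz timestamps so the sketch runs on its own).
buffer = collections.deque(0.01 * n for n in range(1000))

def read_multiple(n):
    # Return up to n timestamps in one call, instead of one per loop iteration.
    out = []
    while buffer and len(out) < n:
        out.append(buffer.popleft())
    return out

while buffer:
    available = len(buffer)                      # "Available Time Stamps" property
    stamps = read_multiple(min(available, BLOCK_SIZE))
    for ts in stamps:                            # process one at a time if needed
        pass  # handle each timestamp here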
-Josh
05-05-2009 05:19 PM
Josh,
Thanks for looking into this problem. My original idea was to have a
global timestamp variable deterministically updated, say, 100 times
per second. Then, when data arrive in a serial port buffer, they can
be timestamped from that variable. However, if the latency of 60 ms
remains, one will not be able to timestamp with a resolution of 10 ms.
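In rough Python-style terms (just a sketch of the idea, not the actual LabVIEW code; time.time() stands in for the GPS-referenced timestamp read from the 6682):

import threading, time

latest_timestamp = 0.0          # the shared "global timestamp variable"
lock = threading.Lock()

def update_timestamp(period_s=0.01):
    # Runs 100 times per second and refreshes the shared timestamp
    # (stands in for reading the PXI-6682 in a timed loop).
    global latest_timestamp
    while True:
        with lock:
            latest_timestamp = time.time()   # placeholder for the GPS timestamp
        time.sleep(period_s)

def on_serial_data(data):
    # When bytes arrive on a serial port, tag them with the last known timestamp.
    with lock:
        ts = latest_timestamp
    return ts, data

threading.Thread(target=update_timestamp, daemon=True).start()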
05-06-2009 10:10 AM
Hey Tihomir,
I'd recommend trying to open that example I pointed you to, and also implementing some of the things Josh discussed to see what kind of latency you get.
05-08-2009 12:41 PM
Hi Justin,
I followed your and Josh's advice and started reading multiple timestamps
(see the attached VI). Because of the timestamp array indexing in the VI, the
latency has slightly increased, to about 68 ms.
The ultimate question remains how one can GPS-timestamp serial port
data with a resolution of 10 ms or so.
Thanks,
Tihomir
05-08-2009 01:21 PM