
Problem with using ms timer for serial data timestamps

Hi, I'm having a problem using the ms timer to timestamp my data. What I would like to do is record the current ms timer value at the moment data is read from the serial port. In my VI below, I continuously read data through the serial port, store it in a circular buffer, and record the current timer value alongside each datapoint. Unfortunately, as you can see in the images below, when I place a timer inside the loop that reads the serial data, I do not get a smooth linear ramp from the timer. The timer value jumps every so often, producing a step-like ramp. In the third picture I have overlaid two timers, one outside the serial loop and one inside; although both end at the same value, the one inside the loop looks like it needs to 'catch up'.
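In text form, the loop I've built is roughly equivalent to this sketch (Python with pyserial rather than LabVIEW, just to show the structure; the port name, baud rate, and buffer length are placeholders for my actual setup):

import time
import serial  # pyserial

PACKET_SIZE = 17       # bytes per packet from the device
BUFFER_LEN = 1000      # circular buffer length (placeholder)

ser = serial.Serial("COM3", 57600, timeout=1)  # port and baud are placeholders

data_buf = [None] * BUFFER_LEN
time_buf = [None] * BUFFER_LEN
idx = 0

while True:
    packet = ser.read(PACKET_SIZE)      # blocks until 17 bytes arrive or timeout
    stamp = time.monotonic() * 1000.0   # the ms "timer" read inside the loop
    if len(packet) == PACKET_SIZE:      # only keep complete packets
        data_buf[idx] = packet
        time_buf[idx] = stamp
        idx = (idx + 1) % BUFFER_LEN    # wrap around: circular buffer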

 

Can anyone explain what causes this and suggest a possible fix? At the moment, since the ramp is not linear, if I plot an x-y graph of the serial data against the recorded time, the points bunch up because of the step-like time waveform. I know for a fact that the data from the device is a 14 Hz square wave, so I know the time is not being recorded properly. Is there a way to fix this, or to obtain some sort of more reliable timestamp? I need a timestamp with each datapoint because I sometimes lose packets, and I would like to do frequency analysis on the data, so I need ms-level timestamps.

 

[Attached images: system.jpg, normal.jpg, overlaid.jpg]

Message 1 of 7

Have you considered that the problem isn't with the msec timer?

 

It could be Windows; you cannot rely on Windows to execute code down to the millisecond level.
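You can see this outside of LabVIEW, too. Here is a quick sketch (Python, purely to illustrate the point, not your code): ask the OS to sleep 1 ms repeatedly and measure what it actually delivers.

import time

# Ask the OS for 1 ms sleeps and measure what it actually delivers.
# On Windows the default scheduler tick is about 15.6 ms, so many of
# these "1 ms" sleeps can come back an order of magnitude longer.
deltas = []
prev = time.monotonic()
for _ in range(200):
    time.sleep(0.001)
    now = time.monotonic()
    deltas.append((now - prev) * 1000.0)  # elapsed time in ms
    prev = now

print(f"min {min(deltas):.2f} ms  max {max(deltas):.2f} ms  "
      f"mean {sum(deltas) / len(deltas):.2f} ms")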

 

Also, it could be your code.  Perhaps the code inside the while loop sometimes takes longer on one iteration than on another.  It may be in the implementation of the circular buffer VI.

 

Also, there are times when most of that code doesn't execute at all.  What happens if the returned data is not equal to 17 bytes?  The True case doesn't execute, and whatever data you did get is lost.

Message 2 of 7

Take a look at the value of your BYTES READ indicator.  Is it ALWAYS = 17?

 

Put a probe on it and break if it's not, or set a breakpoint inside the FALSE case.

 

What happens if you receive 18 bytes, say, one whole packet plus the start of another?  Your code throws them both out and starts over.  But the next packet may then be only 16 bytes, so you throw it out too, in an attempt to get back in sync.

 

The BYTES READ indicator will tell you if that's the case. 
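If that turns out to be the problem, one common fix is to accumulate incoming bytes and frame the packets yourself instead of insisting on exactly 17 per read. A rough sketch of the idea (Python rather than LabVIEW; it assumes your packets begin with a known sync byte, which you would need to verify against your device's protocol):

import serial  # pyserial

PACKET_SIZE = 17
SYNC = 0xAA    # assumed start-of-packet byte -- check your device's protocol

ser = serial.Serial("COM3", 57600, timeout=0.1)  # placeholders
pending = bytearray()

def next_packet():
    """Accumulate incoming bytes and return one complete packet at a time."""
    global pending
    while True:
        # Read whatever has arrived (at least 1 byte, blocking up to timeout).
        pending += ser.read(ser.in_waiting or 1)
        # Discard bytes until we are aligned on the sync byte.
        while pending and pending[0] != SYNC:
            pending.pop(0)
        if len(pending) >= PACKET_SIZE:
            packet = bytes(pending[:PACKET_SIZE])
            del pending[:PACKET_SIZE]
            return packet

This way an 18-byte read doesn't cost you two packets; the extra byte just waits in the buffer for the next pass.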

Steve Bird
Culverson Software - Elegant software that is a pleasure to use.
Culverson.com | LinkedIn | Blog for (mostly LabVIEW) programmers: Tips And Tricks

Message 3 of 7
Thanks for the help. It seems the problem is due to the execution rates of the two loops: the data-reading loop is slower, not because of the circular buffer, but because of the serial read. I am attempting to fix it by synchronizing the two loops. Also, when you say break, do you mean a serial break? I'm not sure what the Serial Break VI does.
Message 4 of 7

when you say break, do you mean a serial break?

No, I mean set a breakpoint on that condition, so the program stops and you can see what values are where. 


Message 5 of 7

nicrip wrote:
Thanks for the help. It seems the problem is due to the execution rates of the two loops: the data-reading loop is slower, not because of the circular buffer, but because of the serial read.

Actually, that probably makes perfect sense.  It depends on your baud rate.  What is it?

 

I'm going to assume you are requesting 17 bytes in your bytes to read, since you only do something if you get exactly that many bytes.  I'm going to assume a typical baud rate of 9600 (yours may be faster or slower, and that would change the numbers).  9600 bits per second works out to about 960 bytes per second, depending on data bits, parity, and number of stop bits (8-N-1 is 10 bits per byte).  960 bytes per second means 1 byte takes a little over a millisecond to come in, so 17 bytes are going to take about 17 milliseconds.  Thus, your loop won't iterate until 17 bytes have come in, which means about 17 milliseconds have passed.  So of course you are going to see steps in the millisecond timer when you plot the timer value per iteration.
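To spell out that arithmetic (a quick sketch; substitute your own numbers):

def packet_time_ms(baud, packet_bytes, bits_per_byte=10):
    # 10 bits per byte = 1 start bit + 8 data bits + 1 stop bit (8-N-1)
    bytes_per_sec = baud / bits_per_byte
    return packet_bytes / bytes_per_sec * 1000.0

print(packet_time_ms(9600, 17))   # ~17.7 ms -- the step size you would see
print(packet_time_ms(57600, 17))  # ~3.0 ms at a faster baud rate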

 

How often does the data come in?  What is your timeout value on the COM port?  All of those things will affect the iteration rate of your code as well.

Message 6 of 7
My baud rate is 57600, and I read packets of 17 bytes at a time. I realize now that the time at which the data is read by the PC may not necessarily correspond to the time it was sent by the device. Since the device samples at 250 Hz and sends data continuously, I think it might be best to simply not use timestamps and just assume the data comes in at a constant rate. I could count the number of packets received each second, then use that number to continuously update (every second or so) the time separation between datapoints. That would probably be accurate enough. Thanks for the help, and if you have any ideas please post.
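Roughly what I have in mind (a quick Python sketch; read_packet is a stand-in for my actual serial read, simulated here so the snippet runs):

import time

SAMPLE_RATE_HZ = 250.0             # nominal device rate
dt_ms = 1000.0 / SAMPLE_RATE_HZ    # assumed spacing between datapoints (4 ms)

def read_packet():
    # Stand-in for the real 17-byte serial read; simulates a 250 Hz device.
    time.sleep(1.0 / SAMPLE_RATE_HZ)
    return b"\x00" * 17

count = 0
window_start = time.monotonic()

while True:
    packet = read_packet()
    count += 1
    now = time.monotonic()
    if now - window_start >= 1.0:
        # Re-estimate the real packet rate once per second and update
        # the assumed time separation between datapoints.
        dt_ms = 1000.0 * (now - window_start) / count
        count = 0
        window_start = now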
Message 7 of 7