
Elapsed Time VI Lag

I am currently working on a VI where a certain process (e.g. a plot update) needs to occur at a user-specified interval. I do not want the entire program to hang while waiting for the interval to pass, especially because the interval could be on the order of a minute, which rules out a blocking "Wait (ms)" call.
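Since block diagrams don't paste well into a forum post, here is a rough textual sketch of the non-blocking pattern I'm after (Python standing in for the diagram; update_plot is a made-up placeholder for the periodic process):

```python
import time

def update_plot():
    # hypothetical stand-in for the periodic process (e.g. the plot update)
    print(f"fired at {time.monotonic():.4f}")

interval = 60.0                   # user-specified interval, seconds
last_fire = time.monotonic()

while True:                       # the loop stays free to service the UI
    now = time.monotonic()
    if now - last_fire >= interval:
        update_plot()
        last_fire = now           # reset the timer relative to "now"
    time.sleep(0.01)              # short sleep so the loop doesn't spin the CPU
```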

 

I've been experimenting with wiring an Elapsed Time VI's "Time Has Elapsed" output to a case structure, with the Elapsed Time VI set to reset after the user-specified interval has passed. The block diagram for the process is placed inside the case structure. I'm using a second Elapsed Time VI to keep track of the actual elapsed time, but on examining the times it reports, I find that the timer actually increments in steps that are longer than the user-specified interval by a factor of approximately 1.0001. I've attached a test program that demonstrates this.

 

I've tried dividing the factor of 1.0001 out of the value wired to the Time Target (s) input, but that seems to have no effect. I've also tried using the same Elapsed Time VI and keeping a count of how many times the time has elapsed, then multiplying that count by the user-specified interval to obtain the time, but that only amounts to tricking myself. I realize this is a very small lag, but my VI is intended to run for periods on the order of weeks, so the lag could become significant.
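To put a number on why this worries me: if every cycle runs long by that 1.0001 factor, the error grows linearly with the number of cycles. A quick back-of-the-envelope in plain Python, using my measured factor:

```python
interval = 60.0                         # seconds between updates
factor = 1.0001                         # measured: each cycle runs this much long
cycles_per_week = 7 * 24 * 3600 / interval

drift = cycles_per_week * interval * (factor - 1)
print(f"~{drift:.0f} s of drift per week")   # ~60 s -- a full interval lost weekly
```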

 

Does anybody have any suggestions as to how I could address this lag?

 

Thanks in advance for any help!

Message 1 of 5

Hi Griffon2-6,

 

If you want to use LabVIEW you will have to live with small timing errors, since it's not real-time. One thing I can suggest is to use the "Get Date/Time In Seconds" function to keep track of the actual elapsed time. That way you are referenced to your computer's clock, rather than starting one clock with a ~1 us delay, then starting another clock with its own ~1 us delay, and letting those offsets accumulate.

 

Regards,

 

-SS



Message 2 of 5

@ShotSimon wrote:

If you want to use LabVIEW [read: any non-RT platform] you will have to live with small timing errors, since it's not real-time.



Let's not confuse the OP by presenting this as a LabVIEW issue when it's not. I'd hate for him to change languages just because of this and hit the same problem! Even on an RT platform there will be some jitter, although it will be much less.

 

I have to ask the OP, does a difference this small really matter? I doubt your user will be able to tell if a graph updated 0.0001 seconds too late...and if he/she can, I want to meet them!

 

Just read your last sentence. If you timestamp the data when you acquire it, then it shouldn't matter when you graph it; it will be correct. For instance, if you take data and plot it a week later, but you timestamped it as it was taken, it will still show the timestamps from a week before (as it should).

 

The key is to timestamp the data as it's acquired; that way, even if it's plotted late, your x coordinates remain correct and what the user sees visually is also correct. Does this make sense? Data validity isn't about when you update your graph, it's about how you timestamp the data. If you need the data to be acquired deterministically, then you will need an RT platform. May I ask what you are acquiring from?
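A sketch of what I mean, assuming a made-up read_sensor() in place of whatever you're acquiring from (Python standing in for the block diagram):

```python
import time

def read_sensor():
    # hypothetical stand-in for the actual acquisition
    return 42.0

samples = []                                   # (timestamp, value) pairs

def acquire_one():
    samples.append((time.time(), read_sensor()))   # stamp at acquisition time

def plot_all():
    # even if this runs late, each x coordinate is the acquisition time,
    # so what the user sees stays correct regardless of graph latency
    for t, v in samples:
        print(time.ctime(t), v)

acquire_one()
time.sleep(1.5)      # simulate the graph updating late
plot_all()           # the timestamp still reflects when the data was taken
```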

Message 3 of 5

Thanks for the replies, everyone. To answer your question, imstuck, I'm acquiring data from a PLC over Modbus Ethernet.

 

I thought about timestamping the data and using the offset from the start time to calculate the elapsed time, but I haven't tried it yet. However, this still assumes that the Elapsed Time VI is actually giving me the desired elapsed time, and not the desired elapsed time multiplied by a factor of 1.0001, which I'm not certain of. If it isn't, then timestamping the data will either reproduce the same results as before, which is obviously not desired, or report the desired times while being another case of me simply tricking myself.

 

Does anybody know whether the delay observed with the Elapsed Time VI is an actual delay in execution, or just an artifact of how the Elapsed Time VI reports time?

 

Of course, any alternative suggestions as to how I can solve the original problem are welcome too.

 

Thanks.

Message 4 of 5
You can look at the code inside the Elapsed Time function to see how it works.

Since the Windows clock is not going to be accurate to better than about 5 ms at best, your expectations are way out of line.
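If you want to see that granularity for yourself, a quick test with nothing but the Python standard library:

```python
import time

# measure how late a nominal 10 ms sleep actually returns; on a desktop
# OS the overshoot is typically on the order of milliseconds, which
# dwarfs the 0.01% drift discussed above
worst = 0.0
for _ in range(100):
    t0 = time.perf_counter()
    time.sleep(0.010)
    worst = max(worst, time.perf_counter() - t0 - 0.010)
print(f"worst overshoot: {worst * 1000:.2f} ms")
```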
Message 5 of 5