
Minimum elapsed time resolution

Is it possible to achieve elapsed time resolution of less than 1 ms in LabVIEW 7 Express?
Message 1 of 9
> Is it possible to achieve elapsed time resolution of less than 1 ms in
> LabVIEW 7 Express?

The OS resolution for putting threads to sleep is currently 1 ms, and
the accuracy of when threads wake up varies between OS versions and
setups. The thread timing will be pretty good in the general case, but
can have big lags when the user is moving the window or
minimizing/maximizing things.
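
For illustration, you can see the same limit outside LabVIEW by measuring what a requested 1 ms sleep actually costs (a plain C sketch for Windows; the repetition count of 100 is arbitrary):

    #include <stdio.h>
    #include <time.h>
    #include <windows.h>

    int main(void)
    {
        const int reps = 100;
        clock_t start = clock();
        for (int i = 0; i < reps; i++)
            Sleep(1);                /* ask the OS for a 1 ms sleep */
        double total_ms = (double)(clock() - start) * 1000.0 / CLOCKS_PER_SEC;
        /* The average is often noticeably above 1 ms, depending on the
           system timer period and load. */
        printf("average sleep: %.2f ms\n", total_ms / reps);
        return 0;
    }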

For smaller delays, you might want to do a busy loop. Make an empty For
Loop that runs for several million iterations. Build an initialization
phase that lets you tune the iteration count to your computer, and it
should be pretty reliable. But keep in mind that this delays by
consuming the CPU for a period of time, not by releasing it to the OS,
so the more this delay is used, the higher the CPU usage will go.
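
A minimal sketch of the same idea in C (the calibration constants are arbitrary; in LabVIEW this would be an empty For Loop whose iteration count you tune during the initialization phase):

    #include <stdio.h>
    #include <time.h>

    /* Volatile sink so the compiler cannot optimize the loop away. */
    static volatile unsigned long sink;

    /* Burn CPU for the given number of iterations. */
    static void spin(unsigned long iterations)
    {
        for (unsigned long i = 0; i < iterations; i++)
            sink = i;
    }

    /* Initialization phase: measure how many iterations fit in 1 ms. */
    static unsigned long iterations_per_ms(void)
    {
        const unsigned long probe = 10000000UL;  /* 10 million test iterations */
        clock_t start = clock();
        spin(probe);
        double elapsed_ms = (double)(clock() - start) * 1000.0 / CLOCKS_PER_SEC;
        return (unsigned long)(probe / elapsed_ms);
    }

    int main(void)
    {
        unsigned long per_ms = iterations_per_ms();
        printf("calibrated: %lu iterations per ms\n", per_ms);
        spin(per_ms / 10);   /* roughly a 100 microsecond busy delay */
        return 0;
    }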

Greg McKaskle
Message 2 of 9
Thank you very much for your answer.
However, I am interested in measuring the time taken by some part of my LabVIEW code down to 100 microsecond resolution, rather than generating a delay. Is that still impossible?
Message 3 of 9
If you need better resolution than 1 ms, you need to go to a LabVIEW Real-Time system. (You should also make a distinction between "resolution" and "precision": I would not even trust a timing result of several ms on e.g. Windows, because too many other things could be going on.)

If you just need to do some timing on a piece of code for benchmarking, put it in a loop, time the total loop execution, then divide by the loop count. If you loop 1000 times, that gives you an apparent resolution of 1 microsecond. The result will still show significant scatter due to other things taking place. To get more reproducible results, set the benchmarking VI to "time critical priority", run it a few times, then take the fastest. (A rough sketch of this pattern follows below.)

("Time critical priority", puts most othe
r stuff on the backburner, so this is not recommended for normal tasks. mouse movements will be very sluggish, and in my case I even loose wireless internet connectivity, because the driver gets starved.)
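
As promised, here is the loop-and-divide pattern in text form (a C sketch; work_under_test is a hypothetical stand-in for the code being benchmarked):

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical stand-in for the operation being benchmarked. */
    static void work_under_test(void)
    {
        volatile double x = 1.0;
        for (int i = 0; i < 1000; i++)
            x *= 1.000001;
    }

    int main(void)
    {
        const int reps = 1000;
        clock_t start = clock();          /* one timestamp around the whole loop */
        for (int i = 0; i < reps; i++)
            work_under_test();
        double total_s = (double)(clock() - start) / CLOCKS_PER_SEC;
        /* Dividing the total by the loop count gives an apparent resolution
           far finer than the timer tick, at the cost of averaging away the
           minimum and maximum. */
        printf("average: %.3f microseconds per call\n", total_s * 1e6 / reps);
        return 0;
    }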
Message 4 of 9
UW Elec wrote:

> Thank you very much for your answer.
> However, I am interested in measuring the time taken by some part of my
> LabVIEW code down to 100 microsecond resolution, rather than generating
> a delay. Is that still impossible?

The way this is most often resolved is to execute that part, for
instance, 1000 times in a loop. That can give you some idea of the
magnitude of time this operation takes on average, although it won't
show you the maximum or minimum time on such a timescale.

Another solution is to use the performance counter of the Pentium CPUs.
It's tricky, and there are a few Win32 API calls to do that.
Basically you read the counter before and after the operation you want
to time, then you read the counter frequency and calculate the elapsed
time as

t = (c2 - c1) / f
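
A minimal Win32 sketch of that calculation (QueryPerformanceCounter and QueryPerformanceFrequency are the actual API calls; the Sleep is just a stand-in for the operation being timed):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        LARGE_INTEGER f, c1, c2;

        QueryPerformanceFrequency(&f);   /* counter ticks per second */
        QueryPerformanceCounter(&c1);    /* counter before the operation */

        Sleep(1);                        /* stand-in for the code being timed */

        QueryPerformanceCounter(&c2);    /* counter after the operation */

        /* t = (c2 - c1) / f, reported here in microseconds */
        double us = (double)(c2.QuadPart - c1.QuadPart) * 1e6 / (double)f.QuadPart;
        printf("elapsed: %.1f microseconds\n", us);
        return 0;
    }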

Rolf Kalbermatter
Message 5 of 9
> However, I am interested in measuring the time taken by some part of my
> LabVIEW code down to 100 microsecond resolution, rather than generating
> a delay. Is that still impossible?

Ah. A common technique, listed in the other post, is to use repetition
so that a lower-resolution timer can be used; the total is then averaged
to resolve events down into the microsecond range.

Another technique is to use the high-resolution Windows timers, as
mentioned in this article:

http://sine.ni.com/apps/we/niepd_web_display.DISPLAY_EPD4?p_guid=B45EACE3DE8556A4E034080020E74861&p_node=DZ52018&p_submitted=N&p_rank=&p_answer=&p_source=External

If the link doesn't work, search for high-precision timer or something
similar.

Greg McKaskle
Message 6 of 9
Thank you so much. This is exactly what I wanted. You are my saviour. 🙂
Message 7 of 9

Hi,

I am using Windows 7 with a 2.4 GHz processor and LabVIEW 10.0.

What I am trying to do is set a frequency according to a list (such as 1 GHz, 1.3 GHz, 5 GHz, etc.) after a delay of about 2.55 ms, or even smaller, such as 340 µs. It looks like 1 ms is the smallest delay I can make. Is a shorter delay possible, and how could I make it?

Thanks,

Ott

Message 8 of 9

@Ott wrote:

Hi,

I am using Windows 7 with a 2.4 GHz processor and LabVIEW 10.0.

What I am trying to do is set a frequency according to a list (such as 1 GHz, 1.3 GHz, 5 GHz, etc.) after a delay of about 2.55 ms, or even smaller, such as 340 µs. It looks like 1 ms is the smallest delay I can make. Is a shorter delay possible, and how could I make it?

Thanks,

Ott


You do realize this thread is almost 9 years old?  If you have a question, you really should start a new thread.

 

On a Windows PC, the best resolution you are going to get is 1 ms. Even then, Windows can take over and you'll suddenly see 15 ms of nothing. It isn't a deterministic operating system. If you need better determinism, you need to go to a Real-Time system or an FPGA.


Message 9 of 9