Why don't I get about 1 ms resolution from timing VIs in LabVIEW?

Hi,
 
Previously, in LabVIEW 8.5 on my Core 2 Duo 1.7 GHz laptop running Windows XP, the resolution of "Wait (ms)" and "Tick Count (ms)" was about 2 ms, and that of the "Timed Loop" was better than 1 ms.
Later, due to a problem, I formatted my hard disk, restored the system to the factory installation, and reinstalled LabVIEW 8.5 Professional Development System. Now the resolution of both "Wait (ms)" and the "Timed Loop" is about 15 ms. How can I get back the timer resolution I used to get on this PC before formatting?
A book says that a PC normally uses a 60 Hz clock, which is why the timer resolution is about 16.6 ms, and that if we want the PC to use a 1 kHz clock we have to install some multimedia software extensions (e.g., QuickTime extensions on Apple computers). But I don't know what the equivalent multimedia extension files are for Windows XP, or where to get them.
Can you please help me improve the resolution of the timing VIs and the "Timed Loop"?
 
Dr. Javed Ahmed
PhD - Electrical (Telecom) Engineering
Pakistan

Message 1 of 2

The answer to your question is surprisingly complex, and I probably will not do it justice, but I can at least point you in the right direction.

 

Most desktop multitasking operating systems have a "slice time". This is the smallest amount of time the operating system will give to a task or thread before it switches to another task or thread. So if you have a single processor core with no hyperthreading (one hardware thread) and are running three active operating system threads, the operating system will switch between the three threads at an interval equal to the slice time. The default slice time for Windows XP is about 15 ms, which corresponds well to the resolution you are seeing. You can change the default slice time, although I cannot remember how (a net search will probably find this for you pretty quickly). This may affect your processor efficiency, however, since the processor then has to do a lot more thread switching.
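Incidentally, Windows does have a rough equivalent of the "multimedia extensions" you mention: the multimedia timer API in WINMM.DLL. Calling timeBeginPeriod(1) asks the OS for roughly 1 ms timer resolution, which also tightens the granularity of Sleep() and, in my experience, of Wait (ms). Here is a minimal C sketch, not LabVIEW code; the same calls could be reached from LabVIEW through a Call Library Function Node, and the exact improvement depends on the machine:

```c
/* Sketch: raise the Windows timer resolution to ~1 ms using the
 * multimedia timer API in WINMM.DLL, then observe Sleep() granularity.
 * Assumes a Win32 build environment; link with winmm.lib. */
#include <windows.h>
#include <mmsystem.h>
#include <stdio.h>

int main(void)
{
    DWORD t0, t1;

    /* With the default tick, Sleep(1) typically rounds up to ~15.6 ms. */
    t0 = timeGetTime();
    Sleep(1);
    t1 = timeGetTime();
    printf("Sleep(1) before timeBeginPeriod: %lu ms\n", (unsigned long)(t1 - t0));

    /* Request 1 ms timer resolution. */
    if (timeBeginPeriod(1) == TIMERR_NOERROR) {
        t0 = timeGetTime();
        Sleep(1);
        t1 = timeGetTime();
        printf("Sleep(1) after timeBeginPeriod:  %lu ms\n", (unsigned long)(t1 - t0));
        timeEndPeriod(1);  /* always pair with timeBeginPeriod */
    }
    return 0;
}
```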

 

In addition, you will see different behavior from the Tick Count (ms) primitive and the Get Date/Time In Seconds primitive. The former usually has better resolution.
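If you want to measure that granularity yourself, you can spin on a timer and record where its value steps. A minimal C sketch follows; GetTickCount stands in for Tick Count (ms) here as an illustrative assumption, not as the VI's actual implementation:

```c
/* Sketch: measure the step size (granularity) of GetTickCount by
 * spinning until the reported value changes. Assumes Win32. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DWORD prev = GetTickCount();
    int i;

    for (i = 0; i < 10; i++) {
        DWORD now;
        /* Busy-wait until the tick counter advances. */
        do {
            now = GetTickCount();
        } while (now == prev);
        printf("tick stepped by %lu ms\n", (unsigned long)(now - prev));
        prev = now;
    }
    return 0;
}
```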

 

Buried in the Windows XP settings (Control Panel » System » Advanced » Performance Settings » Advanced » Processor scheduling) is an option to adjust for best performance of "Background services". LabVIEW works best when this is selected; the XP default is to optimize for foreground "Programs". This may fix your problem.

 

If you want higher resolution, you can call into the performance timers of your processor using the Windows KERNEL32.DLL.  There are some VIs to do this posted here, along with further discussion on timing.  Note that higher resolution does not get around the slice time issue.
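For reference, the calls in question are QueryPerformanceFrequency and QueryPerformanceCounter, both exported by KERNEL32.DLL and presumably what the posted VIs wrap via Call Library Function Nodes. A minimal C sketch, assuming the hardware provides a high-resolution counter:

```c
/* Sketch: sub-millisecond timing with the Win32 performance counters
 * in KERNEL32.DLL. QueryPerformanceFrequency returns zero if no
 * high-resolution counter is available. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, start, stop;

    if (!QueryPerformanceFrequency(&freq)) {
        fprintf(stderr, "no high-resolution counter available\n");
        return 1;
    }

    QueryPerformanceCounter(&start);
    Sleep(10);                       /* the interval being measured */
    QueryPerformanceCounter(&stop);

    printf("elapsed: %.3f ms (counter runs at %lld Hz)\n",
           (double)(stop.QuadPart - start.QuadPart) * 1000.0 /
           (double)freq.QuadPart,
           (long long)freq.QuadPart);
    return 0;
}
```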

 

If you need further information, let us know.

Message 2 of 2