12-12-2023 08:05 AM
Hi,
I am very new to LabVIEW and programming.
I am trying to implement a millisecond timer using two Tick Count (ms) functions. The timer must start running at the beginning and pause on a rising edge of a digital signal. Although I am able to achieve this, there is a problem with the accuracy of the timer: it reads at least 2 ms less than the actual value. Moreover, sometimes (especially at the beginning) it also shows odd values such as 1 ms. Please let me know if there is a better way to implement the timer, or how I can fix the inaccuracy.
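For readers following along: the approach described above is essentially a polled software stopwatch. Here is a minimal sketch of that logic in Python, where a simulated digital line stands in for the DAQ read and time.monotonic_ns() stands in for Tick Count (ms); the 50 ms edge time and the 1 ms poll interval are purely illustrative assumptions.

import time

_T0 = time.monotonic()

def read_digital_line() -> bool:
    """Hypothetical stand-in for the DAQ digital read: the line goes high ~50 ms after start."""
    return (time.monotonic() - _T0) > 0.050

def software_timer_ms() -> int:
    """Polled software stopwatch: starts immediately, stops on a rising edge of the line."""
    start = time.monotonic_ns()          # analogous to the first Tick Count (ms)
    previous = read_digital_line()
    while True:
        current = read_digital_line()
        if current and not previous:     # rising edge: low on the previous poll, high now
            stop = time.monotonic_ns()   # analogous to the second Tick Count (ms)
            return (stop - start) // 1_000_000
        previous = current
        time.sleep(0.001)                # ~1 ms wait per iteration, like Wait (ms) in the loop

print(software_timer_ms(), "ms")         # prints a value somewhat above 50; the excess is polling/OS jitter

Even in this stripped-down form the result depends on how promptly the loop notices the edge, which is exactly where the inaccuracy discussed below comes from.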
Solved!
12-12-2023 11:07 AM - edited 12-12-2023 11:08 AM
Hi gok,
@gok010 wrote:
Please let me know if there is a better way to implement the timer, or how I can fix the inaccuracy.
You forgot to attach the code.
All we got is an image: we cannot edit/run/debug images in LabVIEW!
Comments/Improvements:
12-12-2023 11:18 AM
In addition to Gerd's notes on the code, there is a more general flaw in your plan: you are running on a non-real-time operating system (presumably Windows).
Any operating system that isn't real-time can and will stop running your program for a few milliseconds from time to time. Even if you fix all the potential errors in the code itself, Windows might suddenly decide that there's something else more important than LabVIEW to do for 10 ms, or 50 ms, or even 1 second.
The solution is to move the timing to the DAQ hardware itself, which can capture and timestamp samples independently of Windows. Right now you are capturing the "instant" state of the DAQ at whatever moment the loop happens to reach the Read function. If you instead use a read that captures continuously into a hardware buffer at 1000 samples per second (or more), then each read returns multiple time-stamped samples that you can analyze afterwards to get the measurement you are actually trying to make.
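As a rough illustration of that buffered approach, here is a sketch using the Python nidaqmx API instead of the LabVIEW DAQmx VIs. The device/line name "Dev1/port0/line0", the 1 kHz rate, and the 100-sample chunk size are assumptions, and note that not every device supports hardware-timed digital input, so check your hardware's specifications.

import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 1000      # samples per second, i.e. one sample every 1 ms
CHUNK = 100      # read 100 ms worth of samples per loop iteration

with nidaqmx.Task() as task:
    # Hypothetical channel name; substitute your own device/port/line.
    task.di_channels.add_di_chan("Dev1/port0/line0")
    # Hardware-clocked, buffered acquisition: the device samples at exactly 1 kHz
    # no matter what Windows is doing in the meantime.
    task.timing.cfg_samp_clk_timing(
        rate=RATE,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=10 * RATE,        # buffer size hint
    )
    task.start()

    samples = []
    edge_index = None
    while edge_index is None:
        # Blocks until CHUNK more samples have accumulated in the hardware buffer.
        samples.extend(task.read(number_of_samples_per_channel=CHUNK))
        # Find the first low-to-high transition in everything recorded so far.
        for i in range(1, len(samples)):
            if samples[i] and not samples[i - 1]:
                edge_index = i
                break

    print(f"Rising edge at sample {edge_index}, i.e. {edge_index * 1000.0 / RATE:.1f} ms after the task started")

Because the sample clock runs on the device, the edge time is resolved to one sample period (1 ms here) regardless of when Windows lets your loop run.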
12-12-2023 11:26 AM
In addition to what has been said, please clarify what you are trying to do:
Please attach your entire code and explain exactly how it should be used. For simplicity, change the code to a simulation by substituting the digital reads with a boolean array control and eliminating all DAQ. This way it is much easier to work out the logic!
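The edge-finding logic of such a simulation might look roughly like this (a Python sketch, with a hard-coded boolean array standing in for the boolean array control and an assumed 1 kHz sample rate):

def first_rising_edge_ms(samples, rate_hz=1000.0):
    """Return the time (in ms) of the first low-to-high transition, or None if there is none."""
    for i in range(1, len(samples)):
        if samples[i] and not samples[i - 1]:
            return i * 1000.0 / rate_hz
    return None

# Hard-coded stand-in for the boolean array control: the line goes high at sample 42.
simulated = [False] * 42 + [True] * 8
print(first_rising_edge_ms(simulated))   # -> 42.0, i.e. 42 ms at 1 kHz

Once this logic works on simulated data, the same code can be fed with real buffered DAQ samples.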
12-12-2023 11:58 AM
12-13-2023 03:56 AM
Hello Altenbach,
Thank you for replying to my post. Your code works well for measurements of 30 ms and above, but it is inaccurate for measurements below 30 ms.
12-13-2023 03:59 AM
Hi gok,
@gok010 wrote:
your code works well for measurements of 30 ms and above, but it is inaccurate for measurements below 30 ms
In which way is it inaccurate? Can you provide some numbers/examples?
The achievable accuracy is ~2 ms because of that 1 ms wait inside the loop (and that is without taking into account the influences introduced by the OS).
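A quick way to see that limit is to measure how long a requested 1 ms wait actually takes. This small Python experiment illustrates the same effect; the exact numbers depend on your OS and machine.

import time

# A requested 1 ms wait is never exactly 1 ms: the OS scheduler rounds it up and adds jitter.
durations_ms = []
for _ in range(200):
    t0 = time.perf_counter()
    time.sleep(0.001)
    durations_ms.append((time.perf_counter() - t0) * 1000.0)

print(f"requested 1.000 ms, got min {min(durations_ms):.3f} ms / max {max(durations_ms):.3f} ms")
# An edge that arrives just after a poll is only noticed on the next iteration, so the
# reported time can be off by a full loop period plus this jitter, i.e. roughly 2 ms.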
12-13-2023 04:10 AM
Hi,
I have tried to acquire in hardware-timed single-point mode at a sample rate of 1000 samples/s, but it still fails to measure times below 30 ms accurately.
12-13-2023 04:11 AM
Yes. For example, when the rising edge occurs less than 30 ms after the start, the timer shows a value of zero.
12-13-2023 05:54 AM - edited 12-13-2023 05:57 AM
Hi gok,
@gok010 wrote:
Yes. For example, when the rising edge occurs less than 30 ms after the start, the timer shows a value of zero.
Why do you think this is a problem with Altenbach's code?
@gok010 wrote:
I have tried to acquire in hardware-timed single-point mode at a sample rate of 1000 samples/s, but it still fails to measure times below 30 ms accurately.
Does your hardware allow for such sample rates in single-point mode?
Does your hardware allow for continuous measurements using hardware timing? (Why don't you use that mode when it is available?)