
Get Date/Time In Seconds Function Accuracy?

I have the attached VI coded to get a timestamp from the computer and output it with six digits in the fraction of seconds. I was wondering if anyone knows how accurate it is because, from my understanding, LV timestamps only have millisecond accuracy. Does this mean that the last three digits are gibberish, or is that the exact time according to what my computer thinks?

 

Thanks

 

Aaron

Message 1 of 11

You are a bit confused.

 

The timestamp type is just a funny way of displaying what is essentially a DBL (they wrap the data through a washing machine to show time and date), so it has at least 10 digits of precision.

 

What it does not have is accuracy or resolution. These are OS dependent, since the OS schedules the servicing of the Real Time Clock. On Windows it updates about once a millisecond (but it counts the ticks it misses too, so it approaches a good clock).

 

If you are looking at milliseconds and below, you don't look at the OS RTC. It's like trying to tell seconds from a sundial.
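A quick way to convince yourself of this outside LabVIEW (rough Python sketch, not LabVIEW code; the exact step size you see depends on your OS and runtime):

# Poll the OS wall clock in a tight loop and record how far apart the
# *distinct* readings are.  The number of digits returned says nothing
# about how often the underlying value actually moves.
import time

steps = []
last = time.time()
while len(steps) < 20:
    now = time.time()
    if now != last:               # the clock ticked to a new value
        steps.append(now - last)
        last = now

print(["%.6f" % s for s in steps])
# On a typical Windows box the steps cluster around the OS timer period
# (historically ~1 ms, or even ~15.6 ms), no matter how many digits you display.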


"Should be" isn't "Is" -Jay
Message 2 of 11

Jeff Bohrer wrote:

The timestamp type is just a funny way of displaying what is essentially a DBL (they wrap the data through a washing machine to show time and date), so it has at least 10 digits of precision.


This was true only up to LabVIEW 7.0, but in newer versions timestamps have 128 bits internally, so converting to DBL potentially loses information.

 

Numerically, timestamps have sub-attosecond resolution. However, there is no conceivable way to call the function predictably with that kind of precision, and by the time the indicator is updated a lot of time has passed.
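To put rough numbers on that (a back-of-the-envelope sketch in Python, not LabVIEW; it assumes the 64-bit-seconds plus 64-bit-fraction fixed-point layout of the timestamp):

# A LabVIEW timestamp is 128-bit fixed point: 64 bits of whole seconds
# since the 1904 epoch plus a 64-bit binary fraction of a second.
import math

frac_step = 2.0 ** -64                  # smallest representable fraction step
print("fixed-point step : %.3e s" % frac_step)   # ~5.4e-20 s, sub-attosecond

# A DBL has only a 53-bit mantissa.  A present-day date is roughly 3.5e9 s
# past the 1904 epoch, so the spacing between adjacent DBL values is already
# around half a microsecond -- that is the information lost on conversion.
seconds_since_1904 = 3.5e9
print("DBL step today   : %.3e s" % math.ulp(seconds_since_1904))   # ~5e-7 s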

 

 

Message 3 of 11

@altenbach wrote:

Jeff Bohrer wrote:

The timestamp type is just a funny way of displaying what is essentially a DBL (they wrap the data through a washing machine to show time and date), so it has at least 10 digits of precision.


This was true only up to LabVIEW 7.0, but in newer versions timestamps have 128 bits internally, so converting to DBL potentially loses information.

 

Numerically, timestamps have sub-attosecond resolution. However, there is no conceivable way to call the function predictably with that kind of precision, and by the time the indicator is updated a lot of time has passed.

 

 


My bad, of course. Accuracy and resolution are still OS dependent.
Thanks again Christian


"Should be" isn't "Is" -Jay
Message 4 of 11

@Jeff Bohrer wrote:
[...] These are OS dependent, since the OS schedules the servicing of the Real Time Clock. On Windows it updates about once a millisecond (but it counts the ticks it misses too, so it approaches a good clock). [...]

This is also not entirely up-to-date for LV (sorry). Please read this knowledge base article for further information.

Please note that the nanosecond engine was introduced with LV 8.0 (as the basis for timed structures).

The system RTC is only a reference the nanosecond engine synchronizes to.

 

Technically, it is possible to achieve wait times of less than a millisecond on a Windows system with LV and the nanosecond engine (though you will not achieve nanoseconds, despite its name!).

But since Windows is not a real-time OS, it is broadly accepted that jitter peaks of more than 10 ms are common. Therefore, it does not make sense to provide easy access to a timing resolution that just disappears in the noise, does it?
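If you want to see that jitter for yourself without LabVIEW, a crude sketch like this makes the point (Python; absolute numbers vary with machine, load and power settings):

# Request a 1 ms wait over and over and record what we actually got.
# On a desktop OS the occasional outlier dwarfs any sub-millisecond resolution.
import time

overshoot_ms = []
for _ in range(1000):
    t0 = time.perf_counter()
    time.sleep(0.001)                        # ask for 1 ms
    overshoot_ms.append((time.perf_counter() - t0 - 0.001) * 1000.0)

overshoot_ms.sort()
print("median overshoot: %.3f ms" % overshoot_ms[len(overshoot_ms) // 2])
print("worst overshoot : %.3f ms" % overshoot_ms[-1])
# The worst-case number is the jitter in question -- it can easily be an
# order of magnitude larger than the wait you asked for.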

 

Hope this helps,

Norbert

Norbert
----------------------------------------------------------------------------------------------------
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.
Message 5 of 11

Thanks for all of the input. So I think I understand what's happening. Even though the Get Date/Time function is precise to sub-millisecond resolution, it receives the time information from the OS clock, which drifts over time from the RTC. Since the OS clock updates from the RTC every millisecond or so, we can only assume that the timestamp received is accurate to the millisecond.

 

From what Norbert said and the article linked, it sounds like the timestamp is received from the nanosecond engine, which is driven by the RTC. So then would the timestamp be accurate to sub-millisecond resolution? I may have misunderstood the article completely though, as I am new to LabVIEW and still trying to learn most of its features.

 

Thanks

 

Aaron

Message 6 of 11

What are you actually trying to measure? For example, if you tried to use this to measure the execution time of a code fragment, the mere process of getting the start and end times would already slightly alter the result. Together with the OS scheduling, the result will vary randomly near some typical value and will not be reproducible, ever.
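As an illustration of the first point (plain Python, not LabVIEW, but the effect is the same): just reading the clock twice back to back already costs a measurable, varying amount of time, and that uncertainty sits on top of every start/stop measurement you take.

# Measure the cost of merely asking for a timestamp by taking two readings
# back to back.  Every "get time, run code, get time" measurement carries
# at least this much uncertainty, and it changes from run to run.
import time

deltas_us = []
for _ in range(10000):
    t0 = time.perf_counter()
    t1 = time.perf_counter()
    deltas_us.append((t1 - t0) * 1e6)

deltas_us.sort()
print("typical clock-read cost: %.3f us" % deltas_us[len(deltas_us) // 2])
print("worst observed         : %.3f us" % deltas_us[-1])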

Message 7 of 11

I am really trying to measure the start time of a counter measurement. I posted the details in this thread.

 

http://forums.ni.com/t5/Counter-Timer/Triggering-a-counting-vi-on-a-timestamp/td-p/1630752

 

The main goal is to have microsecond accuracy at best, but I'm starting to realize that may not be possible (and maybe not even necessary).

 

Thanks

 

Aaron

Message 8 of 11

@anrran wrote:

The main goal is to have microsecond accuracy at best, but I'm starting to realize that may not be possible (and maybe not even necessary).


You can use LabVIEW Real-Time.

=====================
LabVIEW 2012


Message 9 of 11

Thanks for the suggestion, but unfortunately I do not have LV Real-Time and I'm in no position to purchase it.

 

Aaron

Message 10 of 11