07-14-2011 03:20 PM
I have the attached VI coded to get a timestamp from the computer and output it with six digits in the fraction of seconds. I was wondering if anyone knows how accurate it is, because from my understanding LV timestamps have only millisecond accuracy. Does this mean that the last three digits are gibberish, or is that the exact time according to what my computer thinks?
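For reference, here is a rough text-language analogue of what the VI does (the real thing is a LabVIEW diagram, so this Python sketch only assumes it reads the system clock and formats the fractional seconds to six digits):

# Hypothetical Python analogue of the VI: read the system clock and print it
# with six fractional digits.
from datetime import datetime

now = datetime.now()
print(now.strftime("%Y-%m-%d %H:%M:%S.%f"))   # %f always prints six digits
# Six digits are printed regardless of how often the underlying OS clock
# actually ticks, which is exactly the question being asked here.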
Thanks
Aaron
07-14-2011 04:44 PM
You are a bit confused.
The timestamp type is just a funny way of displaying what is essentially a DBL (they wrap the data through a washing machine to show time and date), so it has at least 10 digits of precision.
What it does not have is accuracy or resolution. These are OS dependent, since the OS schedules the servicing of the Real Time Clock. On Windows it updates about once a millisecond (but it counts the ones it misses too, so it approaches a good clock).
If you are looking at milliseconds and lower, you don't look at the OS RTC. It's like trying to tell seconds from a sundial.
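To make the accuracy/resolution point concrete, here is a small sketch (Python rather than LabVIEW, purely to illustrate the OS behaviour) showing how coarsely the OS wall clock actually ticks, no matter how many digits it is displayed with:

# Sample the OS wall clock for ~50 ms and look at the smallest step between
# distinct values it reports.
import time

values = set()
deadline = time.perf_counter() + 0.05        # sample for roughly 50 ms
while time.perf_counter() < deadline:
    values.add(time.time())

unique = sorted(values)
steps = [b - a for a, b in zip(unique, unique[1:])]
print(f"distinct wall-clock values seen: {len(unique)}")
if steps:
    print(f"smallest observed step: {min(steps) * 1e3:.3f} ms")
# On a typical Windows box the smallest step is on the order of a millisecond
# (or ~15.6 ms at the default timer resolution), no matter how many digits the
# timestamp itself can carry.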
07-14-2011 04:59 PM
Jeff Bohrer wrote: The timestamp type is just a funny way of displaying what is essentially a DBL (they wrap the data through a washing machine to show time and date), so it has at least 10 digits of precision.
This was true only up to LabVIEW 7.0, but in newer versions timestamps have 128 bits internally, so converting to DBL potentially loses information.
Numerically, timestamps have sub-attosecond resolution. However, there is no conceivable way to call the function predictably with that kind of precision and by the time the indicator is updated a lot of time has passed.
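For illustration, a minimal sketch of the numbers involved (the 64.64 fixed-point layout assumed here, a signed 64-bit count of whole seconds since the 1904 epoch plus an unsigned 64-bit fraction of a second, is how NI describes the 128-bit time stamp; the Python helper is purely illustrative):

# Compare the numeric step size of a 64.64 fixed-point timestamp with that of
# a DBL near a 2011-era value of seconds-since-1904 (~3.4e9 s).
import math

FRACTION_BITS = 64

def split_seconds(seconds):
    """Split a time in seconds into whole seconds and a 64-bit fraction."""
    whole = int(seconds)
    frac = round((seconds - whole) * 2**FRACTION_BITS)
    return whole, frac

print(split_seconds(1.5))                       # (1, 9223372036854775808)
print(f"fixed-point step: {2**-64:.2e} s")      # ~5.4e-20 s, sub-attosecond
print(f"DBL step near 3.4e9 s: {math.ulp(3.4e9):.2e} s")   # ~4.8e-7 s
# A DBL can only resolve about half a microsecond at today's timestamp values,
# while the 128-bit representation resolves far below an attosecond -- which is
# why converting to DBL can lose information, and why the numeric resolution
# says nothing about how accurately the clock can actually be read.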
07-14-2011 07:25 PM - edited 07-14-2011 07:32 PM
@altenbach wrote:
Jeff Bohrer wrote: The timestamp type is just a funny way of displaying what is essentially a DBL (they wrap the data through a washing machine to show time and date), so it has at least 10 digits of precision.
This was true only up to LabVIEW 7.0, but in newer versions timestamps have 128 bits internally, so converting to DBL potentially loses information.
Numerically, timestamps have sub-attosecond resolution. However, there is no conceivable way to call the function predictably with that kind of precision and by the time the indicator is updated a lot of time has passed.
My bad, of course. Accuracy and resolution are still OS dependent.
Thanks again Christian
07-15-2011 08:33 AM
@Jeff Bohrer wrote: [...] These are OS dependent, since the OS schedules the servicing of the Real Time Clock. On Windows it updates about once a millisecond (but it counts the ones it misses too, so it approaches a good clock). [...]
This is also not entirely up to date for LV (sorry). Please read this knowledge base article for further information.
Please note that the nanosecond engine was introduced with LV 8.0 (as the basis for timed structures).
The system RTC is only a reference that the nanosecond engine synchronizes to.
Technically, it is possible to achieve wait times of less than a millisecond on a Windows system with LV and the nanosecond engine (though you will not achieve nanoseconds, despite its name!).
But since Windows is not a real-time OS, it is broadly accepted that jitter peaks of >10 ms are common. Therefore, it does not make sense to provide easy access to a timing resolution that just disappears in the noise, does it?
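As a rough illustration of that jitter (a Python sketch on a desktop OS; in LabVIEW the analogous experiment would use a Timed Loop or a Wait with a short period):

# Request a 0.5 ms wait repeatedly and record how late the OS actually
# returns control.
import time

requested = 0.0005                    # ask for a 0.5 ms wait
lateness = []
for _ in range(200):
    start = time.perf_counter()
    time.sleep(requested)
    lateness.append(time.perf_counter() - start - requested)

lateness.sort()
print(f"median lateness: {lateness[len(lateness) // 2] * 1e3:.3f} ms")
print(f"worst lateness:  {lateness[-1] * 1e3:.3f} ms")
# On Windows the worst case is routinely several milliseconds or more, which is
# why sub-millisecond resolution tends to disappear into the scheduling noise.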
hope this helps,
Norbert
07-15-2011 10:16 AM
Thanks for all of the input. So I think I understand what's happening. Even though the Get Date/Time function is precise to sub-millisecond resolution, it receives the time information from the OS clock, which drifts over time relative to the RTC. Since the OS clock updates from the RTC every millisecond or so, we can only assume that the timestamp received is accurate to the millisecond.
From what Norbert said and the article linked, it sounds like the timestamp is received from the nanosecond engine, which is driven by the RTC. So would the timestamp then be accurate to sub-millisecond resolution? I may have misunderstood the article completely, though, as I am new to LabVIEW and still trying to learn most of its features.
Thanks
Aaron
07-15-2011 10:33 AM
What are you actually trying to measure? For example, if you tried to use this to measure the execution time of a code fragment, the mere process of getting the start and end times would already slightly alter the result. Together with the OS scheduling, the result will vary randomly near some typical value and will not be reproducible, ever.
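A small sketch of that measurement-overhead point (Python stands in for the idea; the same caveat applies to wiring two timestamp reads around a piece of LabVIEW code):

# Measure how long it takes just to read the clock twice in a row.
import time

overheads = []
for _ in range(10_000):
    t0 = time.perf_counter()
    t1 = time.perf_counter()
    overheads.append(t1 - t0)

print(f"min overhead:  {min(overheads) * 1e9:.0f} ns")
print(f"mean overhead: {sum(overheads) / len(overheads) * 1e9:.0f} ns")
# Any code fragment you time carries at least this much measurement noise, plus
# whatever jitter the OS scheduler adds on top, so very short timings are never
# exactly reproducible.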
07-15-2011 10:45 AM
I am really trying to measure the start time of a counter measurement. I posted the details in this thread.
http://forums.ni.com/t5/Counter-Timer/Triggering-a-counting-vi-on-a-timestamp/td-p/1630752
The main goal is to have, at best, microsecond accuracy, but I'm starting to realize that may not be possible (and maybe not even necessary).
Thanks
Aaron
07-15-2011 01:56 PM
@anrran wrote:
The main goal is to have, at best, microsecond accuracy, but I'm starting to realize that may not be possible (and maybe not even necessary).
You could use LabVIEW Real-Time.
07-18-2011 02:06 PM
Thanks for the suggestion but unfortunately I do not have LV Real Time and I'm in no position to purchase it.
Aaron