08-30-2006 09:32 AM
09-11-2014 06:52 AM
Will.D wrote:
Wow... 5 stars to all who provided insight into this.
Always nice to learn these little tidbits of computer history. Thanks guys.
Indeed
We were wondering why LabVIEW was using this reference. And now we have a full answer.
Nice story
09-11-2014 08:05 AM
Normally I would chastise a user for resurrecting an 8-year-old thread. But I'm so glad you did; there are some interesting reads in this thread that I would have otherwise missed. Thanks.
Unofficial Forum Rules and Guidelines
Get going with G! - LabVIEW Wiki.
17 Part Blog on Automotive CAN bus. - Hooovahh - LabVIEW Overlord
09-12-2014 01:57 AM
@Hooovahh wrote:
Normally I would chastise a user for resurrecting an 8-year-old thread. But I'm so glad you did; there are some interesting reads in this thread that I would have otherwise missed. Thanks.
So, necromancy can be used for good. 🙂
Great read!
/Y
09-12-2014 06:16 AM - edited 09-12-2014 06:19 AM
@jasonhill wrote:
Earlier than that. The 32-bit date will overflow sometime around 2040. Then we can party like it's 1904!
Just so nobody gets the wrong idea: that was when LabVIEW used a U32 as its timestamp value, since the classic Mac OS did too. They changed it to a double-precision floating-point number quite early on, which doesn't have that limit anymore. A double has an almost unlimited upper range for this, although the precision degrades, eventually becoming coarser than one second at around 2^53 seconds after January 1, 1904, which is roughly 285 million years from now.
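To make those numbers concrete, here is a minimal Python sketch (an editorial illustration, not LabVIEW code) that checks both limits, assuming the January 1, 1904 epoch and a 365.25-day year:

```python
from datetime import datetime, timedelta

# LabVIEW's classic epoch, inherited from the old Mac OS.
EPOCH = datetime(1904, 1, 1)

# An unsigned 32-bit seconds counter rolls over after 2**32 seconds.
print(EPOCH + timedelta(seconds=2**32))  # 2040-02-06 06:28:16

# A double stays precise to better than one second up to 2**53 seconds;
# beyond that, adjacent representable values are more than 1 s apart.
seconds_per_year = 365.25 * 86400
print(2**53 / seconds_per_year)  # ~2.85e8, i.e. roughly 285 million years
```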
But in LabVIEW 7 they decided that this was not enough and introduced the new timestamp datatype. That one can address ±2^63 seconds from 1904, which is about ±292 billion years, and has a practical resolution of 1/2^32 seconds: it has 64 bits for the fractional part, but the lower 32 bits are not used as far as I can tell.
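As an aside (not from the thread): the flattened form of that timestamp is a signed 64-bit count of whole seconds since the 1904 epoch followed by an unsigned 64-bit fraction of a second. A minimal Python sketch of that layout, assuming exactly that 16-byte, big-endian format:

```python
import struct

def flatten_timestamp(seconds: int, fraction: float) -> bytes:
    """Pack a timestamp as an i64 of whole seconds since 1904-01-01 UTC
    plus a u64 binary fraction of a second (theoretical resolution 2**-64,
    though per the post above the low 32 bits seem to go unused)."""
    frac_bits = int(fraction * 2**64) & (2**64 - 1)
    return struct.pack(">qQ", seconds, frac_bits)

def unflatten_timestamp(raw: bytes) -> float:
    """Recover seconds since the 1904 epoch as a float (lossy)."""
    seconds, frac_bits = struct.unpack(">qQ", raw)
    return seconds + frac_bits / 2**64

# Half a second past the epoch:
blob = flatten_timestamp(0, 0.5)
print(blob.hex())                 # 00000000000000008000000000000000
print(unflatten_timestamp(blob))  # 0.5
```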
09-16-2014 10:52 AM
In addition to Rolf's explanation, here's a LabVIEW Timestamp white paper.