06-22-2011 09:53 AM
Hi,
I have one question related to this topic. As you recommended, in LabVIEW 2009 I put an indicator to see the value, but when I opened the same file in LabVIEW 2010 it didn't work: the indicator loses the digits of precision, and I didn't change anything.
Can someone tell me another way to see what's happening? Do I need some extra library for floating-point numbers or something like that?
Thanks!!!
(Sorry for asking about the same topic again, but I didn't change anything...)
06-22-2011 09:57 AM
Hi,
I would like to add this notice:
Due to changes to the LabVIEW compiler, the results of several mathematical operations performed using floating-point numbers might differ from results returned in previous versions of LabVIEW. The accuracy of algorithms written in LabVIEW using floating-point numbers is the same and in many cases improved in LabVIEW 2010. However, in a few operations the results might be less accurate than in previous versions because LabVIEW 2010 implements functions internally with the same numeric precision as the input data types rather than a higher numeric precision than the input data types as in previous versions. The acceptable error for the results of these operations is still appropriate for the data types of the inputs.
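For a feel of what that notice means in practice, here is a minimal sketch in Python (my own analogy, not LabVIEW code): summing the same inputs with plain double-precision intermediates versus a higher effective intermediate precision changes only the very last decimal digit of the result.

```python
from math import fsum

# Ten copies of 0.1 should sum to exactly 1.0. Accumulating with plain
# double-precision intermediates versus a higher effective intermediate
# precision differs only in the last decimal digit -- the kind of change
# described in the notice above.
values = [0.1] * 10

print(sum(values))    # 0.9999999999999999  (double-precision intermediates)
print(fsum(values))   # 1.0                 (compensated, higher-precision summation)
```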
Thanks for your feedback!
06-22-2011 10:00 AM
mlop wrote: I have one question related to this topic. As you recommended, in LabVIEW 2009 I put an indicator to see the value, but when I opened the same file in LabVIEW 2010 it didn't work: the indicator loses the digits of precision, and I didn't change anything.
Can you attach the 2009 VI (or a simplified version containing only the indicator)? Have you tried changing the format again?
@mlop wrote:
Can someone tell me another way to see what's happening? Do I need some extra library for floating-point numbers or something like that?
As I said, attach the 2009 VI and tell us exactly what you see and what you expect to see in 2010.
06-22-2011 10:01 AM
Opening in 2010 should make no difference. Did you actually look at the Display Format? Does it really matter, since the indicator was only there to prove to yourself that Scan From String is working perfectly fine? You are really asking about a problem that does not exist.
06-22-2011 10:04 AM
@mlop wrote:
Hi,
I would like to add this notice:
Due to changes to the LabVIEW compiler, the results of several mathematical operations performed using floating-point numbers might differ from results returned in previous versions of LabVIEW...
This is completely irrelevant to your formatting problem. Here we are talking about changes in the result of complicated operations where the difference is extremely minor and probably in the 15th decimal digit or smaller. You are reading a string and converting to DBL. This is lossless within the inherent limitations of the DBL representation.
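As a rough Python analogy (my addition, not LabVIEW code): parsing a decimal string into a double preserves everything up to the ~15-17 significant digits a DBL can hold, and only the display format decides how many of those digits you actually see.

```python
# Parsing a decimal string into a double is lossless within the limits of
# the DBL representation; a short display format only hides digits.
s = "123.456789012345"
x = float(s)           # analogous to Scan From String -> DBL

print(f"{x:.2f}")      # 123.46            -- short display format hides digits
print(repr(x))         # 123.456789012345  -- the value itself is intact
```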
06-22-2011 10:08 AM
Attached you will find it!
I know it makes no sense to ask this, but I can't see the value in 2010. However, when I try it on another PC with 2009 there is no problem (thanks for your responses).
Thanks again!
06-22-2011 10:15 AM
Do you see the full value when you place an indicator on the string before scanning to DBL?
06-22-2011 10:18 AM
Hi,
Yes! I can see the value received by VISA perfectly (with all of its digits of precision), but not after the "Scan Value" block.
The indicator "output 1" shows it without them.
Thanks!
06-22-2011 10:26 AM
I see the full value in 2010.
06-22-2011 10:37 AM
You have the chart Y-axis formatting set to "Automatic".
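In case it helps future readers, here is a hypothetical Python sketch (my addition, not the actual chart code) of what an "Automatic"-style display format does: it shows only a few significant digits, while the underlying DBL value still carries all of them.

```python
# An "Automatic"-style format shows only a handful of significant digits
# (6 in this sketch), even though the stored DBL keeps ~15-17 of them.
y = 1.23456789012345

print(f"{y:.6g}")     # 1.23457           -- 6-significant-digit display
print(repr(y))        # 1.23456789012345  -- full value still stored in the DBL
```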