09-15-2011 03:40 PM
Refer to my attachment
For Format Into String, the precision comes first and the width (which includes the decimal point, right?) comes second.
For Scan From String it seems to be the opposite: width first, precision second.
Is what I am saying correct? Why is it like this?
09-15-2011 06:54 PM
If you right-click on the scan from string or format to string and choose Edit Scan String or Edit Format String (or read the help), you will see there is a difference.
In Scan From String, digits of precision is not used. The width number also has different meanings between the two functions. In Format Into String, it is the minimum field width. In Scan From String, it is the maximum number of characters to read. So in your Scan From String, %5.4f is equivalent to %5f, which will only take the first 5 characters. In your Format Into String, %5.4f means the minimum field width is 5 and you want 4 digits after the decimal point. You can see you actually got 7 characters out, because it needed more than 5.
09-16-2011 09:29 AM
Oh, that's clear. Thanks!