07-19-2006 08:25 AM
Thanks for the try, but I am still in the dark ages of LabVIEW 7.1.
I should probably mention that on all future posts.
The way I do it right now is to use the Decimal String To Number conversion on just one character.
Is that the best way?
Perhaps it is my old "cycle counting dinosaur" mentality, but that just seems inefficient for a single byte.
Thanks again.
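For what it's worth, a single ASCII digit can be converted with plain byte arithmetic instead of a full string-to-number parse. Here is a minimal C sketch of the two approaches, with atoi standing in for the LabVIEW conversion primitive (this is an illustration of the idea, not LabVIEW code):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *s = "5";

    /* Full parse, analogous to Decimal String To Number. */
    int parsed = atoi(s);

    /* Single-byte trick: an ASCII digit is stored as '0' + its value,
       so subtracting '0' (decimal 48) recovers the digit directly. */
    int digit = s[0] - '0';

    printf("parsed = %d, digit = %d\n", parsed, digit); /* both print 5 */
    return 0;
}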
07-19-2006 08:56 AM
When I run the cast with a string constant of "5", the output is 53.
When I run the Decimal String To Number conversion, I get 5.
They both consistently take about 120 ms for 1,000,000 conversions.
Any idea why I get the incorrect value?
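The 53 is actually expected if the cast reinterprets the raw byte rather than parsing it: the character '5' is stored as ASCII code 53 (0x35). A short C sketch of the difference (again, C is just an illustration here, not LabVIEW code):

#include <stdio.h>

int main(void)
{
    char c = '5';

    /* Cast-style behaviour: read the raw byte, i.e. the ASCII code. */
    printf("cast value:   %d\n", (int)c);   /* prints 53 */

    /* Parse-style behaviour: interpret the byte as a decimal digit. */
    printf("parsed value: %d\n", c - '0');  /* prints 5  */

    return 0;
}

So if you want the numeric digit from the cast's output, subtract 48 (the code for '0') afterwards.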