06-16-2023 11:02 AM
Hi Sir,
In short: if I use Type Cast to get the number as binary, it is much faster and more efficient for data transport, right?
Thanks.
06-16-2023 10:20 PM
It's not limited to "numbers" and there is no "conversion". The bits remain untouched, so there is little overhead.
Formatting a numeric value into a readable decimal string and parsing it back is expensive in comparison, and since the string typically needs more bytes, transporting it is proportionally more work too.
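As a rough illustration (a C sketch with a made-up value, not LabVIEW code): the flattened double is always 8 bytes with no conversion work, while the decimal string needs formatting, parsing, and roughly twice as many bytes.

/* Sketch: compare the size of a raw IEEE-754 double (what a Type Cast /
   flatten sends) with the size of its decimal-string representation
   (what formatting into text would send). */
#include <stdio.h>
#include <string.h>

int main(void)
{
    double value = 1234.56789012345;

    /* "Type cast" view: copy the untouched bits into a byte buffer. */
    unsigned char raw[sizeof value];
    memcpy(raw, &value, sizeof value);          /* 8 bytes, bits unchanged */

    /* "Format/parse" view: render the same value as readable decimal text. */
    char text[32];
    int len = snprintf(text, sizeof text, "%.15g", value);

    printf("raw bytes: %zu, decimal string: %d bytes (\"%s\")\n",
           sizeof value, len, text);
    return 0;
}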
06-17-2023 05:28 AM - edited 06-17-2023 05:28 AM
Typecast is faster than conversion, but unlike in C and other similar languages it is not blindingly fast.
In C a typecast is almost exclusively a compile-time affair: it just tells the compiler that the data now has a different type. In LabVIEW, a string is normally created or copied, which is a significant effort (though much less than parsing a string or formatting one). Even when you typecast between numeric types, LabVIEW executes code to make sure the memory access does not corrupt the buffers on either side of the node. So the LabVIEW Typecast has a runtime footprint, while in C a cast only indicates to the compiler how to (re)interpret the data.
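To illustrate the C side, here is a minimal sketch (assuming a 32-bit float and a typical desktop compiler): the bit pattern is reinterpreted without touching the bits, which is what the LabVIEW Type Cast does logically, except that LabVIEW also performs run-time size checks and may copy the buffer.

/* Reinterpret the bits of a float as an unsigned 32-bit integer. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    float f = -2.5f;
    uint32_t bits;

    /* memcpy is the well-defined way to reinterpret bits in C; a decent
       compiler reduces this to a plain register move, i.e. no real work. */
    memcpy(&bits, &f, sizeof bits);

    printf("float %g has bit pattern 0x%08lX\n", (double)f, (unsigned long)bits);

    /* A pointer cast like (uint32_t *)&f only tells the compiler to treat
       the same memory as a different type; it emits no conversion code
       (though dereferencing it would violate strict aliasing rules). */
    return 0;
}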