
how to display char code properly?

Solved!

Hi all,

  I am trying to convert a decimal number (0 to 127) to a char (its ASCII character). I found the code here http://digital.ni.com/public.nsf/allkb/894CF5FE064971BF8625758400014993

 

Untitled_1p.png

 

 

I tried several numbers, but it doesn't show the same character as shown in http://digital.ni.com/public.nsf/allkb/475C4E4F3A638AA58625795F007EC531?OpenDocument

 

For example, I tried 104 and it shows "@ ", but it should be "h". I tried 33 and it shows "@", but it should be "!" (without the quotation marks). I wonder what the right way to do the conversion is, or whether I am displaying it the wrong way.


 

By the way, if I have an array of decimal numbers, besides using a loop and code like the above to convert them to ASCII one by one, is there any faster way to do the conversion? Thanks.

Message 1 of 8

@PKIM wrote:

I am trying to convert the decimal number (0 to 127) to a char (ASCII code). … For example, I tried 104, it shows "@ " but it should be "h". …


I think I found the reason for the first question: I have to convert the integer to a byte (U8) before casting it to a character.
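[Editor's note: LabVIEW code is graphical, but the effect described here can be sketched in Python purely as an illustration. Typecast reinterprets the raw bytes of its input, so a wide integer carries extra bytes into the string; reducing the value to a single byte first gives the expected character.]

```python
import struct

# Reinterpreting a 4-byte integer's raw bytes as text drags in padding
# bytes -- analogous to wiring a non-U8 number into LabVIEW's Typecast.
raw = struct.pack(">i", 104)          # b'\x00\x00\x00h', not just "h"
print(raw)

# Reducing the value to a single byte first gives the expected character.
print(bytes([104]).decode("ascii"))   # -> h
print(chr(33))                        # -> !
```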

Message 2 of 8

PKIM,

 

Typecast is not the function to use for what you want to do. It forces the binary representation of the data to be re-interpreted as a different datatype. The Extended datatype is particularly problematic as it uses different binary representations on different platforms.

 

1. You should use a binary datatype, specifically U8, for numbers which will be converted to characters in a string.

2. If the input is U8, typecast to string actually works.

3. For an array (of U8) use the Byte Array to String function.

4. To convert a string to an array of U8, use the complementary String to Byte Array function.

5. These functions may be found in the Numeric >> Conversion or String >> Path/Array/String Conversion palettes.

 

Lynn

 

number to string.png
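[Editor's note: the array conversion in steps 3 and 4 above can be sketched in Python as an analogy, not as the LabVIEW API itself. A whole array of byte values becomes a string in one operation, with no per-element loop.]

```python
# Analogue of LabVIEW's Byte Array to String: convert all codes at once.
codes = [72, 101, 108, 108, 111]     # ASCII codes for "Hello"
text = bytes(codes).decode("ascii")
print(text)                          # -> Hello

# Analogue of String to Byte Array: the reverse conversion.
back = list(text.encode("ascii"))
print(back)                          # -> [72, 101, 108, 108, 111]
```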

Message 3 of 8

@johnsold wrote:

Typecast is not the function to use for what you want to do. … For an array (of U8) use the Byte Array to String function. …


Thanks johnsold, it does help. I have a question about the data conversion: if the array holds words (two-byte integers) and I would like to convert each byte of a word to ASCII, how can I do that? I mean, how can I extract bits 0 to 7 and bits 8 to 15 from a word in LabVIEW? It is pretty easy to do in C/C++, but I don't know how to do it in LabVIEW.

Message 4 of 8
Solution
Accepted by topic author PKIM

Use the Split Number primitive from the Numeric >> Data Manipulation palette.  If your number is a signed integer, you need to be careful about the sign bit and negative values although those are not relevant for ASCII characters.

 

Lynn

 

Split number.png
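[Editor's note: what Split Number does is plain bit manipulation, shown here in Python as an illustration rather than the LabVIEW primitive itself.]

```python
# Split a 16-bit word into its high and low bytes, the same operation
# LabVIEW's Split Number primitive performs.
word = 0x4821                  # high byte 0x48 ('H'), low byte 0x21 ('!')
high = (word >> 8) & 0xFF
low = word & 0xFF
print(chr(high), chr(low))     # -> H !
```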

Message 5 of 8

@johnsold wrote:

Use the Split Number primitive from the Numeric >> Data Manipulation palette. …


Cool. Thanks a lot.

Message 6 of 8

Then mark Lynn's message as the solution to your question rather than your own thank you message.  First you'll need to go to the options menu to the upper right of your message and unmark it as the solution.

Message 7 of 8

@RavensFan wrote:

Then mark Lynn's message as the solution to your question rather than your own thank you message.  First you'll need to go to the options menu to the upper right of your message and unmark it as the solution.


Sorry about that, my bad ... I meant to mark his reply as the solution but accidentally marked my own reply 😞

Message 8 of 8