LabVIEW


Help converting hexadecimal to ASCII

Solved!

Hi!

 

I have somewhat successfully converted hexadecimal to ASCII; however, the final display adds unnecessary spaces in front of the word.

 

Is there another way to convert hexadecimal to ASCII? 

Message 1 of 14

Can you "save for previous" (and select LabVIEW 2020 or lower) and attach the down-converted version? I cannot see your VI.

 

The question itself is not clear at all, because ASCII is just a convention that assigns characters and control codes to bit patterns, i.e. how they are displayed or what they mean (e.g. linefeed). There is no "conversion" per se. Similarly, "hexadecimal" is also very ambiguous. Is it a binary string? A formatted string containing only the ASCII characters 0..F? Something else?
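To illustrate the distinction (a Python sketch, since LabVIEW strings are byte strings much like Python `bytes`; the values here are my own example, not from the original VI):

```python
# Two things commonly called "hexadecimal" for the text "Hi":
raw_bytes = b"\x48\x69"  # a binary string: two bytes with values 0x48 and 0x69
hex_text = "4869"        # a formatted string: four ASCII characters '4','8','6','9'

# They are different data and need different handling:
print(raw_bytes.decode("ascii"))                # Hi
print(bytes.fromhex(hex_text).decode("ascii"))  # Hi
```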

 

 

Message 2 of 14
Solution
Accepted by topic author Gared898

You need to split the string into 2-character chunks, turn each chunk into a byte, build an array of those values, and then convert that array to a string. Instead, you're generating an I64 but only populating some of its bytes, with the upper 3 bytes still being 0. When converting numbers to strings, you do not want to be touching Type Cast.
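The actual solution is the LabVIEW block diagram below; as a rough textual sketch of the same chunk-by-chunk approach (Python, illustrative only):

```python
def hex_to_ascii(hex_str: str) -> str:
    # Split into 2-character chunks, turn each chunk into one byte,
    # build an array of those values, then convert the array to a string.
    byte_values = [int(hex_str[i:i + 2], 16) for i in range(0, len(hex_str), 2)]
    return bytes(byte_values).decode("ascii")

print(hex_to_ascii("48656C6C6F"))  # Hello
```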

 

[Image: IlluminatedG_0-1688763701957.png]

 

~ The wizard formerly known as DerrickB ~
Gradatim Ferociter
Message 3 of 14

The problem is that your 64-bit integer has a bunch of leading zeros that get turned into spaces.

You need to convert the hex to ASCII one character at a time and use an 8-bit integer for the Hex String to Number conversion.

It's a little Rube Goldberg, but this is what I came up with.

 

[Image: Screenshot 2023-07-07 144649.png]

EDIT: A little less Rube Goldberg 
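The leading-zero symptom can be reproduced outside LabVIEW; a Python sketch (illustrative only, and here the extra bytes show up as NULs rather than the spaces the asker saw):

```python
# Parsing the whole hex string as one 64-bit number leaves the
# upper bytes zero; reinterpreting that number as a string puts
# those zero bytes in front of the text.
value = int("48656C6C6F", 16)      # "Hello" parsed as a single number
as_i64 = value.to_bytes(8, "big")  # the 8 bytes of the I64
print(as_i64)                      # b'\x00\x00\x00Hello'

# Converting one 2-character chunk at a time as 8-bit values avoids this:
fixed = bytes(int("48656C6C6F"[i:i + 2], 16) for i in range(0, 10, 2))
print(fixed.decode("ascii"))       # Hello
```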

========================
=== Engineer Ambiguously ===
========================
Message 4 of 14

[Image: snip.png]

Message 5 of 14

Sometimes it helps to search the forum first:

 

 

[Image: HexStringtoBinaryString.png]

Message 6 of 14

If the input string is guaranteed to be clean (an even number of characters, only 0..F), I might use a map constant, especially if the string is very long (error handling can be added for the case where a key is not found):
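A rough Python analogue of the map-constant idea (the names are mine, not from the VI): build the 256-entry lookup once, then translate the string pair by pair.

```python
# One entry per possible byte value: "00" -> '\x00', ..., "48" -> 'H', ...
hex_map = {format(n, "02X"): chr(n) for n in range(256)}

def hex_to_ascii(s: str) -> str:
    # Assumes clean input: even length, uppercase hex digits only
    # (a missing key raises KeyError, where error handling would go).
    return "".join(hex_map[s[i:i + 2]] for i in range(0, len(s), 2))

print(hex_to_ascii("48656C6C6F"))  # Hello
```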

 

[Image: altenbach_0-1688781107821.png]

 

Message 7 of 14

Fun fact: Byte Array to String is a no-op unless the resulting string or source array gets modified further. Type Cast goes through a serialization/deserialization and therefore a memory copy operation. For small sizes and infrequent operations... who cares, but Type Cast can add unnecessary overhead for repeated or larger operations.

[Image: IlluminatedG_0-1688793864724.png]

[Image: IlluminatedG_1-1688794188468.png]
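A loose Python analogy for the view-versus-copy distinction (not LabVIEW semantics, just the general idea): a `memoryview` shares the underlying buffer, like a no-op reinterpretation, while constructing `bytes` copies it, like Type Cast's serialize/deserialize round trip.

```python
data = bytearray(b"Hello")
view = memoryview(data)  # no copy: a view over the same buffer
snapshot = bytes(data)   # a copy: new memory is allocated and filled

data[0] = ord("J")       # modify the original buffer
print(bytes(view))       # b'Jello' -- the view reflects the change
print(snapshot)          # b'Hello' -- the copy does not
```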

 

 

~ The wizard formerly known as DerrickB ~
Gradatim Ferociter
Message 8 of 14

You would think that if we are dealing with a U8 array, the compiler would generate identical code... 😄

Message 9 of 14

@altenbach wrote:

You would think that if we are dealing with a U8 array, the compiler would generate identical code... 😄


In theory, but in practice most nodes are more or less direct calls to a precompiled C function in the LabVIEW kernel, save for some optimizations done by the compiler for unused parameters or alternative code paths. So if you place a Type Cast on the diagram, the corresponding Type Cast C function is invoked, which is prepared to deal with arbitrary (flat) data. The String to Byte Array function and its inverse sibling are a bit special, as there is no C function needed behind them. They simply change the diagram datatype and carry flags telling the compiler that they do not modify the data in any way themselves. For the rest, they are a real NOP, especially at runtime.

Rolf Kalbermatter
My Blog
Message 10 of 14