LabVIEW


interpreting TCP read output string

Solved!

What is the proper method to interpret data from the TCP Read function? The output is a string of characters, which are obviously not hex values. How do I convert the string into a binary array or bytes?

Message 1 of 13

I don't know.

 

But if you attach what the data looks like in the string of characters, and tell us what the data is supposed to say, then we might be able to interpret it.

 

You can use "String to Byte Array" to convert the string to an array of bytes.
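
If it helps to see the idea outside LabVIEW: the "string" from a TCP connection is really just a sequence of byte values, and String to Byte Array simply reinterprets those bytes as numbers. A rough Python sketch of the same idea (the example payload is made up):

# The "string" read from the socket is just raw bytes.
raw = b"SYNC\x00\x04\x00\x00"   # hypothetical example payload
byte_values = list(raw)          # the same bytes, viewed as numbers
print(byte_values)               # [83, 89, 78, 67, 0, 4, 0, 0]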

Message 2 of 13

How to interpret the data completely depends on what you are communicating with.  If you are dealing with binary data, I find Unflatten From String to be very useful.  But to get any more specific, we need to know the instrument and/or the messaging protocol it is using.
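
For readers more at home in a text language, Unflatten From String is roughly the equivalent of unpacking a binary buffer against a known field layout, e.g. with Python's struct module. The layout below is only an assumed example, not necessarily this poster's actual protocol:

import struct

# Assumed layout: two unsigned 32-bit integers (big-endian, LabVIEW's default
# flatten byte order), followed by a fixed-size payload.
packet = struct.pack(">II", 6, 1024) + b"\x00" * 1024   # fabricated test packet
msg_type, length = struct.unpack(">II", packet[:8])
payload = packet[8:8 + length]
print(msg_type, length, len(payload))                    # 6 1024 1024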


Message 3 of 13

I read 1032 bytes. The first 4 bytes should be equal to 5, 6, 7, or 8. The next four bytes should always equal 1024. The remaining 1024 bytes are arbitrary measurement data.

 

I have attached a screenshot of the block diagram and the resulting output from the TCP Read function, along with the conversion using the String to Byte Array function as well as the Type Cast function. None of the outputs seem to make sense, especially the output of the TCP Read block.
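
In a text language, the layout I just described could be unpacked roughly like this (a Python sketch only to spell out the intended format; the byte order is an assumption):

import struct

def parse_message(raw, order=">"):   # ">" = big-endian, "<" = little-endian
    """u32 message type, u32 data length, then that many bytes of data (1032 total)."""
    msg_type, length = struct.unpack(order + "II", raw[:8])
    data = raw[8:8 + length]
    return msg_type, length, data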

Message 4 of 13

@jmountney wrote:

 The next four bytes should always equal 1024.


I don't quite understand that statement.  A byte can only have a value from 0 to 255.

 

Or are they 4 bytes that if they are typecast together into a U32 would be equal to 1024?
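
Just to show the arithmetic behind "typecast together into a U32", assuming big-endian byte order, which is what LabVIEW's Type Cast uses:

# Bytes 0x00 0x00 0x04 0x00 combined big-endian into one unsigned 32-bit value.
value = (0x00 << 24) | (0x00 << 16) | (0x04 << 8) | 0x00
print(value)   # 1024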

 


@jmountney wrote:

I read 1032 bytes. The first 4 bytes should be equal to 5, 6, 7, or 8.


What do those 4 bytes mean?

 

 

I think you may need to go back and look at whatever program is putting the data into the database.  Is it another LabVIEW program?  Perhaps what it is putting in isn't quite what you think it is.

Message 5 of 13

The first 4 bytes (32 bits) define a message type. There are 4 types of messages (5, 6, 7 & 8).

 

The second 4 bytes (32 bits) would equal 1024. This value simply defines the length of the data in the remaining part of the message.

 

 

Message 6 of 13
Solution
Accepted by topic author jmountney

Ok, wild guess time...

 

The first four bytes spell "SYNC", so they look like a sync marker.

The next four bytes, 0 4 0 0, are a 32-bit integer, least significant byte first: 1024.

Then you have 6 0 0 0, which could be another 32-bit integer, LSB first: 6.

 

Then your random data
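
A quick way to check that reading outside LabVIEW is Python's struct module (the 12 header bytes below are the ones described above):

import struct

header = b"SYNC" + bytes([0, 4, 0, 0]) + bytes([6, 0, 0, 0])
sync = header[:4]                                       # b"SYNC"
length, msg_type = struct.unpack("<II", header[4:12])   # two little-endian u32s
print(sync, length, msg_type)                           # b'SYNC' 1024 6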

 

Rod.

Message 7 of 13

Forget what I said about a database.  I was confusing another thread with this one where the person was having very similar problems.

 

Obviously, what you are getting doesn't match what you expected.  You aren't getting 5 6 7 8.  The first 4 characters spell out SYNC like you see.  The next 4, 0 4 0 0, if it is MSB byte last, do equal 1024.  The next 4 bytes are 6 0 0 0.  Does the number 6 mean anything to you?  Is that the message type of either 5, 6, 7 or 8, but saved as a 32-bit number?  After that, things start looking like a pattern of data stored in binary.

Message 8 of 13

Ahhhh................ Thank you. Now it makes sense. I now feel like an idiot. 

Message 9 of 13

Hi LV-Pros,

 

I'm fighting with something similar. I use the following library to connect to my MySQL server: https://decibel.ni.com/content/docs/DOC-10453

An awesome tool, because my application needs to run on Mac and Windows (LV native tools only, no Connectivity Toolkit required).

 

But it only works for the first 128 characters, because LV only gets those 128 characters right (no matter whether it's the TCP Read or the String to Byte Array function). The interpretation of byte values above 127 is also OS dependent (Windows shows something different than Mac). Is the TCP Read output string dependent on ASCII? I believe (sorry, my background is biotech, I don't know much about programming) that the MySQL DB sends UTF-8, and since ASCII and UTF-8 are the same up to 128, TCP Read brings back the correct string. Above 128 the server sends an unsigned word, so a U16, as far as I understand the whole thing.

 

Now how can I tell the TCP Read function to interpret the data sent as UTF-8, coded as U8 and U16? I'm totally lost... 😕
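
To make my question concrete in a text language: the raw bytes come off the socket unchanged, and treating them as UTF-8 would be a separate decode step, something like this Python sketch (not LabVIEW):

raw = b"Gr\xc3\xbcezi"        # UTF-8 bytes for "Grüezi" (made-up example)
text = raw.decode("utf-8")    # interpret the same bytes as UTF-8 text
print(text)                   # Grüezi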

 

Thank you all and sunny greetings from Switzerland

Message 10 of 13