LabVIEW


Byte Count At VISA Read.

I am trying to learn instrument control. How do I use VISA Read? I don't know what value must be given to the byte count input of VISA Read. If we don't know the number of bytes to read, then what value should we give? Thank you.

Message 1 of 9

Hi govindsankar,

 


@govindsankar wrote:

If we don't know the number of bytes to read, then what value should we give?


When you don't know the number of bytes in the expected message, the message should end with a termination character (TermChar): configure this TermChar in VISA Configure Serial Port! Then request a number of bytes (much) larger than a reasonable maximum message length…

 

When there is no TermChar to end a message, your communication protocol (should) work with fixed-size messages: just request the expected number of bytes…
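The two strategies can be sketched in Python as a minimal simulation against an in-memory buffer (not real VISA calls; the function name and defaults are made up for illustration):

```python
def read_message(buf: bytes, term_char: bytes = b"\n", max_bytes: int = 1024) -> bytes:
    """Simulate a VISA-style read: return bytes up to and including the
    termination character, or at most max_bytes if no TermChar appears."""
    idx = buf.find(term_char)
    if idx != -1:
        return buf[: idx + len(term_char)]   # TermChar found: stop there
    return buf[:max_bytes]                   # fixed-size fallback

# TermChar-terminated message: the byte count just needs to be "big enough"
print(read_message(b"MEAS:VOLT 1.234\n<next message>", max_bytes=1000))
```

Note that the requested count (1000 here) only caps the read; the TermChar is what actually ends it.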

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 9

That answer depends on what kind of communication protocol you are using with whatever device you are talking to.

 

The most commonly seen implementation in LabVIEW examples reads the Bytes at Port property with a property node and wires it into the VISA Read byte count terminal.  That also happens to be the wrong thing to do 99% of the time.

 

If the device uses a communication protocol that is basically human-readable ASCII characters and ends the message with a carriage return or line feed (very common in serial communication), then you configure the port with the termination character enabled and set to whichever character that is (13 dec / 0D hex for CR, 10 dec / 0A hex for LF).  Then you wire in a number of bytes that is larger than the longest message you ever expect to get.  The VISA Read will terminate and return what it has as soon as it sees the termination character.

 

If the protocol is binary, where any of the 256 possible byte values can be part of the message, then you need to disable the termination character.  You will need to read up on the protocol to determine how many bytes to read.  Often there is a byte or two early in the message that tells how many more bytes there are to read after that.
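That length-prefix pattern can be sketched in pure Python over a byte buffer (the 2-byte big-endian header is an assumption for illustration; real devices vary, so check your protocol documentation):

```python
import struct

def read_frame(buf: bytes) -> bytes:
    """Parse one binary frame whose first two bytes (big-endian) give the
    payload length; return just the payload."""
    if len(buf) < 2:
        raise ValueError("need at least the 2-byte length header")
    (length,) = struct.unpack(">H", buf[:2])   # first read: how many bytes follow
    if len(buf) < 2 + length:
        raise ValueError("incomplete frame")
    return buf[2 : 2 + length]                 # second read: exactly that many bytes

frame = b"\x00\x04\xde\xad\xbe\xef" + b"<next frame>"
print(read_frame(frame).hex())  # deadbeef
```

With VISA this maps to two reads: one read of the header bytes, then a second read whose byte count comes from the header.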

Message 3 of 9

@govindsankar wrote:

I am trying to learn instrument control. How do I use VISA Read? I don't know what value must be given to the byte count input of VISA Read. If we don't know the number of bytes to read, then what value should we give? Thank you.


99% of instrument communication is figuring out the protocol (message format, command structure, etc.).  If you don't know the protocol, then you can't talk to the instrument.  It would be like speaking Japanese to a South African.  So unless you can tell us the protocol, there is not really much we can do to help.


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 4 of 9

I am using GPIB.

Message 5 of 9

Hi govindsankar,

 


@govindsankar wrote:

I am using GPIB 


The hardware used (GPIB, RS-232, or whatever) doesn't matter; it's the protocol that matters!

 

Which protocol does your DAQ device use or support? Its manual surely mentions all the details…

Best regards,
GerdW


Message 6 of 9

@govindsankar wrote:

I am using GPIB 


Ok, let me rephrase some things here.  GPIB, RS-232, TCP/IP, etc. are what I would describe as the "Low-Layer Protocol".  That is the level at which raw data is transferred.  The joy of VISA is that it abstracts the low layer so you don't have to worry about it.

 

What I stated you need is described as a "High-Layer Protocol" or syntax.  This is the format of the data coming across the low layer.  For example, SCPI is a High-Layer protocol standard because it describes the format of the data coming across whatever bus is used (initially designed for GPIB, but also commonly used over RS-232, USB, and Ethernet).
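As an illustration of that bus-independence, SCPI messages are plain ASCII terminated by a newline, so the same strings work over GPIB, serial, USB, or Ethernet. A minimal sketch (the instrument response here is made up for illustration):

```python
# Typical SCPI messages: ASCII commands terminated by a newline TermChar.
IDN_QUERY = "*IDN?\n"           # ask the instrument to identify itself
MEAS_QUERY = "MEAS:VOLT:DC?\n"  # query a DC voltage measurement

# A (hypothetical) *IDN? response is also ASCII ending in the TermChar,
# so a VISA Read with the termchar enabled returns exactly one reply:
response = "KEYSIGHT,34465A,MY12345678,A.03.01\n"
fields = response.strip().split(",")   # vendor, model, serial, firmware
print(fields[1])  # 34465A
```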

Message 7 of 9

Another way to think about it:

 

High-level protocol: The language you're writing (English, Japanese, Spanish)

Low-level protocol: How you send the message (Email, hand-written letter, fax)

 

The low-level protocol doesn't matter too much. Your question could be rephrased as "How many words should I read in this email my friend sent me?"

 

The answer to that would be "Read until you encounter his email signature; that indicates the end of the email." The same is true with a termchar. If the instrument you are communicating with uses a termchar, enable it and set Bytes To Read to 1000 (or something very high). VISA Read will return bytes from the port (serial, GPIB, doesn't matter) until one of these things happens:

- You've read the requested number of bytes
- The timeout is reached
- A termchar is received

 

99% of the time, you want to read until you hit a termchar, as that indicates the end of the message.
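Those three exit conditions can be sketched as a small pure-Python simulation over a byte stream (the function name and signature are invented for illustration; this is not a VISA API):

```python
import time

def visa_read_sim(stream: bytes, byte_count: int,
                  termchar: bytes = b"\n", timeout_s: float = 0.05):
    """Simulate VISA Read's three exit conditions: termchar seen,
    requested byte count reached, or timeout elapsed."""
    data = bytearray()
    deadline = time.monotonic() + timeout_s
    for byte in stream:
        data.append(byte)
        if data.endswith(termchar):
            return bytes(data), "termchar"      # end of message
        if len(data) >= byte_count:
            return bytes(data), "byte count"    # request satisfied
        if time.monotonic() > deadline:
            return bytes(data), "timeout"       # ran out of time
    return bytes(data), "timeout"               # wire went quiet

print(visa_read_sim(b"OK\nnext message", 1000))  # (b'OK\n', 'termchar')
```

With a termchar-terminated protocol, the large byte count (1000) never actually triggers; the termchar ends the read first.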

 

To use the email metaphor, you're basically saying "Read this email until you either: Read 1000 words, Have been reading for more than 5 minutes, OR you see an email signature."

Post your device's communication manual and we can further help you figure out the decoding scheme, but as stated above, 90% of the work in creating an instrument driver is figuring out the communication protocol.

Message 8 of 9

If you are unlucky enough to have to work with a device that doesn't have a very good protocol, then there is another method you can use to read your data. This is usually very reliable in a command/response type of communication; it can be much more difficult if the device sends asynchronous messages.

The method I have used with great success is to break your read into two separate reads. The first read looks for a single character, using a reasonable timeout. The second read occurs in a loop and reads blocks of data; how large depends on the size of the data you are expecting. This read in the loop uses a very short timeout (I generally use about 50 ms). The loop exits on a timeout error. You can ignore the timeout error since you are actually expecting it; pass any other errors you encounter through.

The logic behind this approach is that the device sends its data out in a continuous stream, so when there is a moment of silence on the interface the assumption is that there is no more data. You may have to adjust the short timeout value depending on your device. Also, when using an RS-232 connection I add in the minimum time required for the data transmission based on the baud rate. In a command/response situation this works very well; if the device sends asynchronous messages, things get a bit more complex.
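The two-read approach above can be sketched with a fake port (pure Python; FakePort and its read signature are invented stand-ins for a VISA session, not a real API):

```python
class Timeout(Exception):
    pass

class FakePort:
    """Stands in for a VISA session: hands out queued chunks, then 'times out'."""
    def __init__(self, chunks):
        self.chunks = list(chunks)
    def read(self, nbytes, timeout_ms):
        if not self.chunks:
            raise Timeout()              # silence on the wire
        return self.chunks.pop(0)[:nbytes]

def read_response(port, first_timeout_ms=2000, chunk_timeout_ms=50, chunk_size=64):
    # First read: wait (relatively) long for the first byte of the response.
    data = bytearray(port.read(1, first_timeout_ms))
    # Follow-up reads: very short timeout; a timeout means the device went quiet.
    while True:
        try:
            data += port.read(chunk_size, chunk_timeout_ms)
        except Timeout:
            return bytes(data)           # expected error: end of the burst

port = FakePort([b"H", b"ello, ", b"world"])
print(read_response(port))  # b'Hello, world'
```

Any exception other than the expected timeout would still propagate out of the loop, matching the advice to pass other errors through.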



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 9 of 9