LabVIEW


visa inter byte time

Hi everybody!
 
I am working on an application that uses VISA to communicate over a serial port. This is on a regular PC running Win XP, using COM1, under LabVIEW 8.
 
Now, here are the challenges. I have tried to solve these myself, but they seem a little too much to handle, so I'm asking here as a last resort. It's not a matter of life or death, but a solution would be GREAT for my application. 🙂
 
Long story short, here is what I am trying to do:
 
  1. Manipulate the inter-byte time when sending messages over serial.
  2. Detect the inter-byte time when receiving messages.
  3. Count the time between the end of a transmission and the beginning of a reception.

Any help on any of these would be greatly appreciated. Thanks and have a great day! 😉

 
 
 
Message 1 of 4
The inter-byte time can be controlled by the attached Serial Write Slow, and you might be able to come close to measuring the response time with the attached Measure Serial Response. I'm not sure about measuring the received inter-byte time in LabVIEW. That might require an instrument connected externally to the PC's UART. Maybe someone else can come up with an idea for this.
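
In case it helps to see the logic outside of a block diagram, here is a rough text sketch of the same two ideas in Python with PyVISA (the resource name, baud rate, delay value, and query strings are made-up placeholders, not anything taken from the attached VIs):

import time
import pyvisa

rm = pyvisa.ResourceManager()
# COM1 usually shows up in VISA as ASRL1::INSTR; the baud rate here is just an example
port = rm.open_resource("ASRL1::INSTR", baud_rate=9600)

def write_slow(message, inter_byte_delay_s=0.005):
    # Send the message one byte at a time, pausing between bytes,
    # which is what sets the inter-byte gap in software.
    for b in message:
        port.write_raw(bytes([b]))
        time.sleep(inter_byte_delay_s)

def measure_response_time(query):
    # Timestamp the end of the write, then poll until the first
    # response byte shows up at the port (like Bytes at Port > 0).
    port.write_raw(query)
    start = time.perf_counter()
    while port.bytes_in_buffer == 0:
        pass
    return time.perf_counter() - start

write_slow(b"HELLO\r\n")
print("response delay (s):", measure_response_time(b"ID?\r\n"))

Both the delay and the timestamps live in application code, so they are at the mercy of the OS scheduler, which is the caveat that comes up later in the thread.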

Message Edited by Dennis Knutson on 11-08-2006 10:43 AM

Message 2 of 4
Thank you very much for your response! Both VIs have given me very good ideas. It turns out that my colleagues use an oscilloscope to determine the inter-byte time in the response, so that part is solved.
 
I have a couple of questions regarding your solutions:
  1. What is the minimum reliable time you estimate I can use as the delay in "Serial Write Slow.vi" with LabVIEW under Win XP? Do you know if I would get much better results using LabVIEW Real-Time? MUCH better (could you tell me the order of magnitude)? And would I use the same functions for creating delays in LabVIEW Real-Time?
  2. It seems to me that the second VI would actually measure the time until the whole response has been received in the buffer, because the second recording of time is done after using the "Bytes at Port" property. Am I wrong?

Message Edited by louis_nichols on 11-09-2006 01:23 AM

Message 3 of 4

I don't think there is a minimum reliable time when using Windows. The msec wait gives you 1 msec resolution, but at any time the OS can jump in between an iteration of the loop or between functions and do whatever it feels is necessary. You can experiment with the priority that the VI runs at, but if you need absolute control, then LabVIEW Real-Time is the way to go.
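
If you want a feel for how loose a software wait really is on a desktop OS before deciding on real-time, a quick experiment along these lines (plain Python, purely illustrative, not LabVIEW code) shows the jitter on a nominal 1 ms wait:

import time

# Request a 1 ms sleep many times and record how long each one really took.
requested_s = 0.001
samples = []
for _ in range(1000):
    t0 = time.perf_counter()
    time.sleep(requested_s)
    samples.append(time.perf_counter() - t0)

print("min  %.3f ms" % (min(samples) * 1000))
print("max  %.3f ms" % (max(samples) * 1000))
print("mean %.3f ms" % (sum(samples) / len(samples) * 1000))

On a typical desktop machine the worst-case value comes out well above the requested 1 ms, which is the scheduling effect described above.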

If you noticed, the loop exits as soon as the byte count is greater than 0, so the tick count difference is probably going to be fairly small whether you take the count before or after. You could also wire the two functions in parallel. I put the Tick Count inside a sequence structure so that you had the flexibility to move it where you wanted. You might want to use that scope for inter-byte delay to see what an actual response delay is and compare it with the software measurement technique.
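
To make the first-byte point concrete, here is the same measurement written two ways in Python (again just a sketch that assumes a PyVISA serial resource like the one opened in the earlier sketch; the function names and expected_len are placeholders): stopping the clock as soon as the byte count goes nonzero versus only after the whole reply has been read.

import time

def first_byte_latency(port, query):
    # Stop the clock as soon as the first byte shows up at the port,
    # equivalent to polling Bytes at Port until it is greater than 0.
    port.write_raw(query)
    start = time.perf_counter()
    while port.bytes_in_buffer == 0:
        pass
    return time.perf_counter() - start

def full_response_latency(port, query, expected_len):
    # Stop the clock only after the entire reply has been read,
    # which also folds the device's transmit time into the number.
    port.write_raw(query)
    start = time.perf_counter()
    port.read_bytes(expected_len)
    return time.perf_counter() - start

The difference between the two numbers is roughly the time the device spends clocking out its reply.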

Message 4 of 4