11-08-2006 07:23 AM
Any help on any of these would be greatly appreciated. Thanks and have a great day! 😉
11-08-2006 11:43 AM - edited 11-08-2006 11:43 AM
Message Edited by Dennis Knutson on 11-08-2006 10:43 AM
11-09-2006 01:21 AM - edited 11-09-2006 01:21 AM
Message Edited by louis_nichols on 11-09-2006 01:23 AM
11-09-2006 07:31 AM
I don't think there is a minimum reliable time when using Windows. The msec Wait gives you 1 ms resolution, but at any time the OS can preempt the loop between iterations or between functions and do whatever it feels is necessary. You can experiment with the priority the VI runs at, but if you need absolute control, then LabVIEW Real-Time is the way to go.
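LabVIEW is graphical, so as a rough text-language analogue, here is a small Python sketch of the same point: even when you ask for a 1 ms wait, the OS scheduler decides when you actually resume, and the overshoot varies from iteration to iteration. (Python and its timing functions are my substitution here, not anything from the original post.)

```python
import time

def measure_sleep_jitter(requested_s=0.001, iterations=50):
    """Request a ~1 ms sleep repeatedly and record how long each call
    actually took; the difference from 1 ms is OS scheduling jitter."""
    deltas = []
    for _ in range(iterations):
        start = time.perf_counter()
        time.sleep(requested_s)  # the OS may resume us well after 1 ms
        deltas.append(time.perf_counter() - start)
    return deltas

deltas = measure_sleep_jitter()
# The worst case shows the scheduler stepping in between iterations.
print(f"requested 1.000 ms, worst actual: {max(deltas) * 1000:.3f} ms")
```

On a desktop OS the worst-case value is routinely several milliseconds, which is exactly why a real-time target is needed for hard timing guarantees.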
If you noticed, the loop exits as soon as the byte count is greater than 0, so the tick count difference is probably going to be fairly small whether you take the count before or after. You could also wire the two functions in parallel. I put the Tick Count inside a sequence structure so that you had the flexibility to move it where you wanted. You might want to use that scope to look at the inter-byte delay, see what the actual response delay is, and compare it with the software measurement technique.
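As a text-language sketch of the diagram being described: take a tick before starting, poll until the byte count is greater than 0, take a tick after, and report the difference as the measured response delay. The queue-and-timer "instrument" below is a hypothetical stand-in for the real serial port, purely for illustration.

```python
import queue
import threading
import time

def poll_for_response(rx_queue, timeout_s=1.0):
    """Analogue of the LabVIEW loop: tick count before, poll until
    bytes are available (count > 0), tick count after, return delta."""
    start = time.perf_counter()              # tick count before
    while time.perf_counter() - start < timeout_s:
        if not rx_queue.empty():             # byte count > 0: exit loop
            return time.perf_counter() - start   # tick count after
        time.sleep(0.001)                    # ~1 ms wait per iteration
    raise TimeoutError("no response within timeout")

# Hypothetical instrument: replies with an ACK byte after ~20 ms.
rx = queue.Queue()
threading.Timer(0.02, rx.put, args=(b"\x06",)).start()
delay = poll_for_response(rx)
print(f"measured response delay: {delay * 1000:.1f} ms")
```

Comparing this software-measured delay against what the scope shows on the wire tells you how much the polling loop and OS scheduling inflate the measurement.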