06-15-2011 11:38 AM - edited 06-15-2011 11:39 AM
Hi,
I have a very simple VI that measures a voltage and converts it into a text string, which is sent at 9600 baud to a motor controller through a serial port (actually a USB port with a USB/RS485 converter). I am using an NI 9211 for the voltage measurement. My understanding is that the 9211 can sample up to 14 S/s total across all 4 channels. I am using only one channel, so is my sampling rate 14 S/s, or 14/4 S/s? The voltage input is from a strain gauge.
Even supposing it is the lower rate, 14/4 or about 3.5 samples per second, I am still observing a very slow rate of commands being sent to the motor controller. The default values of RATE and SAMPLES PER CHANNEL PER SECOND are both 1000. Does that make sense given that the 9211 has an overall rate of 14 S/s? What should these values correctly be set to?
What is the latency between the calculation and the moment a command is sent out? The third-party motor controller vendor says their controller can send/receive 25 such string commands per second, which means the LabVIEW VI should be able to send about 12 commands per second (while waiting for the other 12-13 to be returned from the encoder on the motor). The motor controller appears to be much faster than the 9211, but what I observe is that it takes well over 1 second for a new command to be sent. How do I understand the timing of my VI, and how do I maximize the rate of command output to the motor controller?
Thanks,
Dave
06-16-2011 03:08 PM
Hey dav2010,
In this scenario, it looks like the ratio between SAMPLES PER CHANNEL and RATE is the issue. Since they are equal, the ratio is 1, meaning each read takes one second to return.
For instance, if you request 1000 samples while acquiring at a rate of 1000 samples per second, then 1000/1000 = 1 second per read. If you change SAMPLES PER CHANNEL to 100, the ratio becomes 100/1000 = 0.1 seconds, so a fresh block of data (and therefore a new serial command) can go out roughly ten times per second.
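If it helps to see that ratio spelled out in text form rather than on the block diagram, here is a minimal sketch of the same idea using the Python nidaqmx API; it is not your LabVIEW VI, the channel name "cDAQ1Mod1/ai0" is a hypothetical placeholder, and the numbers mirror the 1000/1000 example above (on a real 9211 the hardware rate would of course be far lower than 1000 S/s).

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE = 1000       # "RATE" given to the timing configuration
SAMPLES_PER_READ = 100   # "SAMPLES PER CHANNEL" requested by each read

with nidaqmx.Task() as task:
    # Hypothetical module/channel name for illustration only
    task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0")
    task.timing.cfg_samp_clk_timing(rate=SAMPLE_RATE,
                                    sample_mode=AcquisitionType.CONTINUOUS)

    for _ in range(10):
        # Each read blocks until SAMPLES_PER_READ samples are available,
        # i.e. roughly SAMPLES_PER_READ / SAMPLE_RATE seconds:
        #   1000 / 1000 = 1 s per loop,  100 / 1000 = 0.1 s per loop.
        data = task.read(number_of_samples_per_channel=SAMPLES_PER_READ)
        # ...convert the latest value to a text command and write it out
        # to the serial port here...
```

The same relationship applies to the DAQmx Read in your loop: the loop (and therefore the serial command) can only run as often as a full block of samples becomes available.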
There is a good example of how this works here.