01-29-2021 10:55 AM
I currently use the Modbus I/O server, as part of the DSC and Real-Time Modules, to create Modbus slave I/O servers that talk to PLCs as part of a simulator/stimulator device.
The problem is that I'm using LabVIEW to simulate anywhere from 20-500 devices, depending on the chassis, each device with its own slave I/O server, spread across multiple RS-485 comm ports (generally 10-12 devices per comm port and 32 comm ports per chassis). What I'm starting to notice is that the LabVIEW I/O servers can no longer keep up with the requests from the PLCs.
We are running at a slow 9600 baud, with 1 stop bit, 8 data bits, and odd parity (most of the time), in RTU mode. If I'm only running 1 comm port it handles the communication just fine, but as I bring more ports online, the I/O servers miss messages more and more often, which is causing problems with testing of our equipment.
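For a rough sense of the wire time involved, here's a back-of-the-envelope sketch (in Python, just for the arithmetic, not LabVIEW) of a single Modbus RTU read at these serial settings. The 10-register read size and the 5 ms slave turnaround are assumptions for illustration, not measurements from our system:

```python
# Rough wire-time estimate for one Modbus RTU transaction at 9600 baud,
# 8 data bits, odd parity, 1 stop bit.

BAUD = 9600
BITS_PER_CHAR = 1 + 8 + 1 + 1          # start + data + parity + stop = 11 bits
CHAR_TIME = BITS_PER_CHAR / BAUD       # ~1.15 ms per character on the wire

def transaction_time(num_registers, slave_turnaround=0.005):
    """Seconds for one Read Holding Registers request/response pair."""
    request_bytes = 8                       # addr + func + start + qty + CRC
    response_bytes = 5 + 2 * num_registers  # addr + func + count + data + CRC
    silent_interval = 3.5 * CHAR_TIME       # RTU inter-frame gap below 19200 baud
    return ((request_bytes + response_bytes) * CHAR_TIME
            + 2 * silent_interval
            + slave_turnaround)

if __name__ == "__main__":
    t = transaction_time(10)
    print(f"one 10-register poll: {t * 1000:.1f} ms")
    print(f"full cycle of 12 slaves on one port: {12 * t * 1000:.0f} ms")
```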
I'm wondering if anyone knows how to give more priority to the I/O servers.
I've tried increasing LabVIEW's priority in Task Manager, but that hasn't made any difference. CPU usage never exceeds 50%, and the CPU is running above its rated clock speed, so I don't think there are any thermal throttling issues. Memory is only at 15% usage, and I use very little of the hard drive, so I don't think replacing the drives with solid-state drives will fix it.
Does anyone know the name of the process that the Modbus I/O servers run on?
01-29-2021 02:19 PM
It would be impossible to give you any input without a better idea of what the code architecture looks like. For all we know you have everything in one big stacked sequence frame. We would really need to see the code (or at least a good representation of the framework and architecture) before we could make any suggestions on how to improve it.
First question, though: can you change the baud rate? 9600 is pretty slow given the number of devices you are trying to simulate.
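To put a rough number on that, here's a quick sketch (same assumptions as the estimate above: 12 slaves per port, 10-register reads, 5 ms slave turnaround, all illustrative) of how the per-port poll cycle shrinks as the baud rate goes up:

```python
# Rough comparison of per-port poll-cycle time at different baud rates,
# mirroring the earlier transaction estimate: 12 slaves, 10 registers each.

for baud in (9600, 19200, 57600, 115200):
    char_time = 11 / baud                     # 11 bits per character with parity
    silent = max(3.5 * char_time, 0.00175)    # spec uses a 1.75 ms floor above 19200 baud
    per_poll = (8 + 25) * char_time + 2 * silent + 0.005
    print(f"{baud:>6} baud: {12 * per_poll * 1000:5.0f} ms per 12-slave cycle")
```

At 9600 baud a 12-slave cycle works out to something like 600 ms per port, so there is very little slack before a late response from the simulator counts as a missed message; at higher rates the turnaround time, not the wire, becomes the limiting factor.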