08-03-2020 05:04 PM - edited 08-03-2020 05:06 PM
Attached is some code that demonstrates what I believe is a bug in configuring the XNet interface when using CAN FD. I've included a CAN FD database and a standard one to test with; each has just two signals in it. My code initializes a CAN port, creating the frame sessions and clusters I want to use. Here I configure the interface with the settings it will need. Then I start a Signal In Single-Point session for reading signals. I do this in separate steps because this is part of a larger class API.
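The attached code is a LabVIEW diagram, so I can't paste it inline, but the flow it describes is roughly the following in the NI-XNET C API terms (a minimal sketch only; "CANFD_Database", "Cluster", the signal names, and "CAN1" are placeholders for whatever is in your own database and hardware, not names from the attached zip):

```c
#include <stdio.h>
#include <nixnet.h>   /* NI-XNET C API */

int main(void)
{
   nxSessionRef_t session = 0;
   nxStatus_t status;
   f64 values[2];                /* one element per signal in the session list */
   nxTimestamp_t timestamps[2];

   /* Create a Signal In Single-Point session on the first XNET interface.
      Database alias, cluster, and signal names are placeholders. */
   status = nxCreateSession("CANFD_Database", "Cluster", "Sig1,Sig2",
                            "CAN1", nxMode_SignalInSinglePoint, &session);
   if (status != 0)
   {
      printf("nxCreateSession returned %d\n", (int)status);
      return 1;
   }

   /* Start the session and interface, then read the latest value of each signal. */
   nxStart(session, nxStartStop_Normal);
   nxReadSignalSinglePoint(session, values, sizeof(values),
                           timestamps, sizeof(timestamps));

   nxClear(session);
   return 0;
}
```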
When using an interface that is already open, you must configure the database to use the same settings the interface is already using. Otherwise you get error -1074384742, stating that the interface was started with one set of settings and then used with another.
So my code configures the database to have the same settings as the interface, and then we can use that database for our new session. All of this works fine on high-speed CAN 2.0, and it also works fine on CAN FD as long as I am only using one interface. But if I try to do this with two interfaces, the second one generates the error stating it was configured with the wrong settings.
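In C API terms, the workaround amounts to something like the sketch below: read the settings from the session on the interface that is already running and write the same values onto the in-memory cluster before creating the next session. The property IDs here (nxPropSession_IntfBaudRate64, nxPropSession_IntfCanFdBaudRate64, nxPropClst_BaudRate64, nxPropClst_CanFdBaudRate64) are written from memory and correspond to the LabVIEW interface/cluster baud rate properties, so treat them as assumptions and verify the exact names against nixnet.h:

```c
#include <nixnet.h>

/* Sketch of the workaround described above: before creating a new session on an
   interface that is already running, copy the interface's arbitration and FD baud
   rates onto the in-memory cluster so the database settings match.
   NOTE: the property ID names below are assumed; check them in nixnet.h. */
nxStatus_t match_cluster_to_interface(nxSessionRef_t runningSession,
                                      nxDatabaseRef_t clusterRef)
{
   u64 baud = 0, fdBaud = 0;
   nxStatus_t status;

   /* Read the settings the interface is already using from the running session. */
   status = nxGetProperty(runningSession, nxPropSession_IntfBaudRate64,
                          sizeof(baud), &baud);
   if (status != 0) return status;
   status = nxGetProperty(runningSession, nxPropSession_IntfCanFdBaudRate64,
                          sizeof(fdBaud), &fdBaud);
   if (status != 0) return status;

   /* Write the same values onto the cluster the new session will be created from. */
   status = nxdbSetProperty(clusterRef, nxPropClst_BaudRate64,
                            sizeof(baud), &baud);
   if (status != 0) return status;
   return nxdbSetProperty(clusterRef, nxPropClst_CanFdBaudRate64,
                          sizeof(fdBaud), &fdBaud);
}
```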
Is this a bug? I admit that I don't fully understand the in-memory database techniques and database session lifetimes, so I might be doing something wrong. Can someone from NI run the Main VI in the attached zip on two XNet interfaces and tell me what I'm doing wrong? Thanks.
EDIT: I've seen this on LabVIEW 2018 SP1 with XNet 20.0 on Windows 10. The hardware was a USB-8502, and I see the same behavior on a C Series controller running Windows Embedded, also with LabVIEW 2018 and XNet 18.
Unofficial Forum Rules and Guidelines
Get going with G! - LabVIEW Wiki.
17 Part Blog on Automotive CAN bus. - Hooovahh - LabVIEW Overlord
08-04-2020 12:04 PM
If anyone from NI stumbles on this: I did call in and open a service request for it.