10-29-2014 02:16 AM - edited 10-29-2014 02:20 AM
Hi,
I am doing some serial communication in LabVIEW and have battled for a long time with a parity error where it appeared that a lot of 0s were added to the data I read through VISA Read. I fixed my problem according to this article:
Can I Do 9-bit Serial Communication Instead of 7 or 8 bits?
There I modified the visaconf.ini file to disable error replacement. This seemed to fix the problem; at least I got the data I expected, until I built an application.
When I build an EXE, the problem occurs again when I run the application on another computer (not the computer on which it was built). I have made sure to include a custom configuration file, i.e. the modified visaconf.ini file, so that error replacement is disabled in the application as well. When I look at the configuration file that comes along with the application, it also appears that error replacement is disabled; however, it seems not to take effect, since a lot of 0s are once again filled into my dataset.
I have of course made sure that the serial port setting are set up correctly.
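In case it helps, the edit looks roughly like this in visaconf.ini (the section name below is my guess; put the key wherever the linked article indicates):

```ini
; visaconf.ini on the development machine
; (section name assumed -- follow the article for the exact location)
[ASRL settings]
DisableErrorReplacement=1
```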
I am using LV2013.
Anyone tried this before or is able to help somehow?
Thank you
Nicolai
10-29-2014 07:59 AM
Please attach some code so we can see what you are doing. Otherwise it is pure speculation.
Is it possible the other computer has a different kind of serial port or drivers that are preventing your method of sending a 9th bit from working?
How and where are you configuring the serial port?
10-29-2014 08:18 AM
Sure, I have attached my VI; however, I'm not sure it provides any useful information in this case. What I am doing is simply reading a serial port every 750 ms, accessing specific data in it, and plotting it in graphs. The VI works perfectly fine on the development computer.
I don't think the VI or the serial port are what's preventing me. It seems like the configurations in my visaconf.ini file are not transferred to the deployment computer. I have tried the following from the knowledge base:
How to Include VISA Settings in a LabVIEW Installer
Why Does Serial Communication Not Work on my LabVIEW Deployed Executable?
Storing VISA Aliases and Moving Them to Another Machine
I can also see that the 'DisableErrorReplacement' parameter is set in the .ini file that comes along with the application, but it seems like it is not applied, since I keep receiving all these annoying 0s that ruin my data.
As you can see in the VI, I configure my serial port in the 'false' state, and then on the development machine I just added 'DisableErrorReplacement=1' to the visaconf.ini file, which solved my problem before I tried to distribute the app.
Hope some of you guys can help.
Best regards
Nicolai
10-30-2014 02:24 AM
bump.... anyone?
10-30-2014 09:18 AM - edited 10-30-2014 09:24 AM
One problem I see in your VI is that you don't wire the VISA reference through in every case. You have a Use Default if Unwired on one tunnel, so whenever that inner loop stops, you close the port and the VISA reference becomes undefined. Maybe your code works okay, but the way you are using Switch Action on the Start and Stop buttons, and using Reinit to Default to reset those buttons, seems very awkward.
I see you have the serial port configured for 8 data bits, 2 stop bits, and even parity. I only see you reading the data, but I don't see where you are doing anything special like what the link you posted earlier talks about, unless you are doing the byte manipulation in the subVI you didn't attach. Also, the article talks about using either mark or space parity, but you set yours to even parity.
So are you trying to do the things listed in the article? And if so, where are you doing it? It doesn't show up in the code you attached.
Perhaps an even more important question, what are you trying to communicate with that is using 9-bit data bytes? Is this some very old instrument?
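For context, the scheme in that article carries the 9th data bit in the parity mode of each byte, which is why it needs mark/space parity rather than even. Here is a minimal sketch of the idea (illustrative names, not NI-VISA calls):

```python
# Illustration of the 9-bit-over-parity scheme the article describes:
# the 9th bit of each word is carried by switching the UART between
# MARK parity (bit 9 = 1) and SPACE parity (bit 9 = 0) per byte.
# These names are illustrative, not part of the NI-VISA API.
MARK, SPACE = "mark", "space"

def split_9bit(word):
    """Split a 9-bit word into its 8-bit payload and the parity mode
    that encodes the 9th bit."""
    if not 0 <= word <= 0x1FF:
        raise ValueError("word must fit in 9 bits")
    payload = word & 0xFF
    parity = MARK if word & 0x100 else SPACE
    return payload, parity
```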
PS: Why are you using Build Matrix in your VI instead of Build Array?
10-30-2014 09:34 AM
I have done all the things in the articles. Until now I had just modified the visaconf.ini file manually, but I have fixed the problem by finally locating the visaconf.ini file on the deployment computer and inserting the key into it. I have also added code so it's done programmatically.
Yes, I am doing some data processing inside the subVI i did not attach.
I communicate with an FPGA which, according to the specifications, uses 8 data bits. I simply don't know why the fix works, but it does. The only thing I could see was that a lot of 0s were filled into my data, the exact same error the knowledge base article described when parity errors appear. The serial port settings are set according to the specifications I have.
I use Build Matrix because that's what I want to do: inside the subVI I build an array of data, and then I build a matrix from the arrays.
But since I have fixed the problem, I'll not take any more of your time. Thank you for replying and taking a look at my VI.
Best
Nicolai
10-30-2014 10:02 AM
I'm glad you got it working, but I don't understand why you were linking to an article about 9-bit communication when you now say you are communicating with an FPGA that uses 8-bit data.
10-30-2014 01:35 PM
The data I was reading from the FPGA was wrong (not the problem I posted about). I observed that a lot of 0s were filled in between the data, exactly the problem described in the article. I used the fix described in the article I linked to; that's why I linked it. I don't understand either why the 9-bit fix works, since the FPGA uses 8 bits (or at least it is supposed to).
The problem I posted about was that the fix was not transferred to the deployment PC when building an EXE file, but I found a solution.
I was probably not clear in my first post.
Best
Nicolai