07-23-2023 02:09 PM
I realise this thread is 16 years old, but I had the same problem and eventually found a solution.
I noticed that if I did 'Variant To Flattened String' on the entire variant, I was getting the correct Unicode (UTF-16) byte values, but the most significant byte of each character was being lost on conversion to any sort of string array (even a Unicode string array).
So I used 'Variant To Data' to convert to a 2D array of variants, did 'Variant To Flattened String' on each element, interpreted that as UTF-16 LE, and removed a few bytes; after that it displays as Unicode correctly. This is with "UseUnicode=TRUE" in the LabVIEW.ini file.
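In case it helps to see the byte-level idea outside LabVIEW, here is a rough Python sketch of the equivalent transform. The 4-byte big-endian length prefix and the UTF-16 LE payload layout are assumptions based on inspecting my own flattened data in a hex display, not documented LabVIEW behaviour, so check yours first:

import struct

def flattened_to_unicode(flat: bytes) -> str:
    # Assumed layout: 4-byte big-endian length prefix, then UTF-16 LE payload.
    (length,) = struct.unpack(">I", flat[:4])
    payload = flat[4:4 + length]
    return payload.decode("utf-16-le")

# Quick self-test with hand-built flattened bytes for the string "Hé"
flat = struct.pack(">I", 4) + "H\u00e9".encode("utf-16-le")
print(flattened_to_unicode(flat))  # prints: Hé

The print at the end just confirms that once the prefix bytes are stripped, the remaining bytes decode cleanly as UTF-16 LE, which is the same thing the VI does with string functions.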
Perhaps there is a neater way, but I haven't found it yet.
I have attached an example with an Excel file to test it in. Hope this helps someone!
Leah