07-25-2023 07:44 PM
Does anyone know how to parse (and evaluate parsed) formulas that have array inputs and, more generally speaking, array outputs?
The Formula Node permits array inputs and/or array outputs; however, the Formula Node must be defined at compile time, and my attempts to pass the same formula to the Parse Formula Node VI produce a "wrong variable name" error whenever I include square brackets in the formula to index a variable. In the application I am developing, I want the user to be able to specify any formula at runtime, rather than being restricted to what I predefine. For scalars, the LabVIEW Parse and Evaluate Parsed Formula Node or Formula String VIs have proved adequate.
I have found the muParserX library, which I might play around with, but I would prefer to use LabVIEW VIs for simplicity (indeed, for scalars, despite my testing demonstrating that muParser significantly outperforms the LabVIEW VIs and offers more flexibility, I still went with the latter for that reason).
Thanks.
07-25-2023 08:28 PM
What are you trying to achieve?
07-25-2023 10:25 PM - edited 07-25-2023 10:31 PM
Hi Santhosh,
Elaborating: the application I am developing is for general data acquisition and control, so one of its premises is that I have no knowledge of what users may want to compute in real time from the data being acquired. Ideally, the user interface should therefore offer the flexibility to perform any type of computation based on any acquired channel(s).
The case I have handled so far is when computations are based on only the most recently acquired data: the user types in a formula and assigns an acquired data channel to each variable (I did this using "Parse Formula String.vi" and "Eval Parsed Formula String.vi"). However, there is also interest in performing real-time computations on a moving window of acquired data, such as moving averages, least-squares fit slopes, etc. I was therefore hoping to generalise what I have done so that expressions taking array and scalar inputs can be evaluated to compute a single scalar output (one output per expression); a sketch of the kind of per-window computation I mean follows below. As mentioned in my original post, one might generalise further to array outputs, but in my application outputs will be limited to scalars, and I will add that I have no intention of supporting recursive (IIR) filters at this point.
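For concreteness, here is a generic sketch (in C++, purely illustrative, not code from my application) of that kind of per-window computation: a least-squares slope over the newest n samples, using the textbook formula with the sample index as the x value.

#include <cstddef>

// Least-squares slope of y against x = 0, 1, ..., n-1 (uniform sampling assumed):
// slope = (n*sum(x*y) - sum(x)*sum(y)) / (n*sum(x^2) - sum(x)^2)
double window_slope(const double* y, std::size_t n)
{
    double sx = 0.0, sy = 0.0, sxy = 0.0, sxx = 0.0;
    for (std::size_t i = 0; i < n; ++i)
    {
        const double x = static_cast<double>(i);
        sx  += x;
        sy  += y[i];
        sxy += x * y[i];
        sxx += x * x;
    }
    // Caller must ensure n >= 2 so the denominator is non-zero.
    return (n * sxy - sx * sy) / (n * sxx - sx * sx);
}

The goal is for the user to be able to express this sort of thing as a runtime formula string, rather than my having to precompile it.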
07-25-2023 10:55 PM
A better approach would be to create a Python file with a default function with known input and output types.
The user can update the implementation of this Python function to achieve whatever is required and return the data. All you have to do in the LabVIEW code is invoke this Python file with the DAQ data as input and receive the processed data. This way, the user can do signal processing of any complexity without knowing LabVIEW, and Python is relatively easy to learn.
07-26-2023 08:36 PM
From this I infer that LabVIEW doesn't include the functionality I hoped for, which was fundamentally my query.
Python is one solution, but it seems calling it from LabVIEW is relatively slow, so it is potentially not viable for high-throughput applications (I benchmarked one of the LabVIEW examples for calling Python and found equivalent LabVIEW code to be almost 100,000 times faster when using numeric palette operations, and 1,000 times faster when using the parse and evaluate formula VIs).
In any case, I guess I'll start to compare different options for calling external code.
Thanks, Santhosh.
07-26-2023 09:11 PM
Of course Python is comparatively slow, but it is easy to learn and implement.
On the other hand, the Formula Node is also slow compared to a native LabVIEW implementation. There is always a compromise.
08-03-2023 12:48 AM - edited 08-03-2023 01:42 AM
As a final update for those who come across this thread: I ran some benchmarking tests comparing different mathematical expression parsers, and whilst the testing wasn't extensive, the result was that ExprTk had the best performance. This is consistent with others who have done these sorts of comparisons (for example, https://github.com/ArashPartow/math-parser-benchmark-project). Moreover, when computing a least-squares fit, ExprTk achieved very similar performance to, if not slightly better than, LabVIEW's Formula Node and NI_AALPro.lvlib:Linear Fit.vi, despite the latter two being compiled code. Therefore, considering my requirement of maximising general applicability, ExprTk seems the best choice. Note that it did require developing a C++ wrapper to build a DLL that I could call using the LabVIEW Call Library Function Node; a rough sketch of such a wrapper follows.
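As a minimal illustration of the wrapper approach (this is a sketch, not my actual code; the exported function name and the vector name "x" are inventions for this example), one exported C function can receive the expression string plus the data array and return a scalar:

#include <cstddef>
#include "exprtk.hpp" // header-only ExprTk library

// Compile and evaluate an expression over a double array bound to the
// vector variable "x". Returns 0 on success, -1 on a parse error.
// (A production wrapper would compile once per formula and cache the
// expression rather than re-parsing on every call.)
// __declspec(dllexport) is Windows/MSVC; adjust for other compilers.
extern "C" __declspec(dllexport)
int eval_array_expression(const char* expression_text,
                          double* data, int length,
                          double* result)
{
    exprtk::symbol_table<double> symbols;
    symbols.add_vector("x", data, static_cast<std::size_t>(length));
    symbols.add_constants(); // pi, epsilon, inf

    exprtk::expression<double> expression;
    expression.register_symbol_table(symbols);

    exprtk::parser<double> parser;
    if (!parser.compile(expression_text, expression))
        return -1; // details available via parser.error()

    *result = expression.value(); // e.g. "avg(x)" or "sum(x) / x[]"
    return 0;
}

On the LabVIEW side, the Call Library Function Node passes the formula as a C string, the acquired window as an array data pointer with its length, and receives the scalar by pointer. For repeated evaluation at acquisition rates, the compile step should be split out and done once per formula, since re-parsing on every call would dominate the runtime.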
08-03-2023 02:00 AM
I would very much expect that ExprTk implements its standard functions, such as a least-squares fit, by calling a precompiled function too, possibly even the Intel Math Kernel Library, just as the NI Advanced Analysis Library does.
08-03-2023 05:11 PM
That may be so, but unless I'm missing something, I expected that machine code built to perform a single predefined mathematical function would always have the potential to outperform runtime expression parsers. So whilst I regarded the performance of the former as an upper limit for the latter, realistically I never expected that upper limit to be reached; apparently, though, ExprTk is capable of doing so (at least in some circumstances).
08-04-2023 02:19 AM - edited 08-04-2023 02:19 AM
It's mostly a question of relative cost. Executing only one instruction (calling a function) makes very little difference, no matter whether it happens in a runtime scripting environment or in a compiled machine-code program. If the function being called is comparatively computationally intense, all you see in your timing measurements is the execution time of that function. If, however, you start to do large loops and other complex work at the scripting level, then that will really tend to dominate your time measurements.