LabVIEW


Parse and evaluate parsed formulas involving arrays


Does anyone know how to parse (and evaluate parsed) formulas that have array inputs and, more generally speaking, array outputs?

 

The Formula Node permits array inputs and/or array outputs; however, the Formula Node must be defined at compile time, and my attempts to pass the same formula to the Parse Formula Node VI produce a "wrong variable name" error when I include square brackets in the formula to index a variable. In the application I am developing, I want the user to be able to specify any formula at runtime, rather than being restricted to what I predefine. For scalars, using the LabVIEW Parse/Eval Parsed Formula Node or Formula String VIs has proved adequate.

 

I have found the muParserX library, which I might play around with, but I would prefer to use LabVIEW VIs for simplicity (indeed, for scalars, even though my testing showed that muParser significantly outperforms the LabVIEW VIs and offers more flexibility, I still went with the latter for that reason).

 

Thanks.

Message 1 of 13

What are you trying to achieve?

Santhosh
Soliton Technologies

Message 2 of 13

Hi Santhosh,

 

To elaborate: the application I am developing is for general data acquisition and control, so one of its premises is that I have no knowledge of what users may want to compute in real time from the data being acquired. Ideally, the user interface should therefore offer the flexibility to perform any type of computation based on any acquired channel(s).

 

The case I have handled so far is when computations are based on only the most recently acquired data: the user types in a formula and assigns an acquired data channel to each variable (I did this using "Parse Formula String.vi" and "Eval Parsed Formula String.vi"). However, there is also interest in performing real-time computations based on a moving window of acquired data, such as moving averages, least-squares fit slopes, etc. I was therefore hoping to generalise what I have done so that I can evaluate expressions with array and scalar inputs that produce a single scalar output (one output per expression). As mentioned in my original post, one might generalise further to array outputs, but in my application I can say it will be limited to scalar outputs, and I will add that I have no intention of supporting recursive (IIR) filters at this point.
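For concreteness (just as an illustration of the kind of expression I mean), the least-squares slope of the most recent N sample pairs (t_i, y_i) is

$$\hat{m} = \frac{N\sum_{i=1}^{N} t_i\,y_i \;-\; \sum_{i=1}^{N} t_i \,\sum_{i=1}^{N} y_i}{N\sum_{i=1}^{N} t_i^{2} \;-\; \left(\sum_{i=1}^{N} t_i\right)^{2}}$$

i.e. purely array (and scalar) inputs reduced to a single scalar output.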

Message 3 of 13
Solution
Accepted by topic author banksey255

A better approach would be to create a Python file containing a default function with known input and output types.

 

The user can update the implementation of this Python function to achieve whatever is required and return the data. All you have to do in the LabVIEW code is invoke this Python file with the DAQ data as input and receive the processed data. This way, the user can do signal processing of any complexity without knowing LabVIEW, and Python is relatively easy to learn.

Santhosh
Soliton Technologies

Message 4 of 13

Inferring from this, I suppose LabVIEW doesn't include the functionality I was hoping for, which was fundamentally my query.

 

Python is one solution, but it seems calling it from LabVIEW is relatively slow, so it is potentially not viable for high-throughput applications (I benchmarked one of the LabVIEW examples for calling Python and found equivalent LabVIEW code to be almost 100,000 times faster when using numeric-palette operations, and 1,000 times faster when using the parse and evaluate formula VIs).

 

In any case, I guess I'll start to compare different options for calling external code.

 

Thanks, Santhosh.

Message 5 of 13

Of course Python is comparatively slow, but it is easy to learn and implement.

 

On the other hand, the Formula Node is also slow compared to a native LabVIEW implementation. There is always a compromise.

Santhosh
Soliton Technologies

Message 6 of 13

As a final update for those who come across this thread: I ran some benchmarking tests comparing different mathematical expression parsers and, whilst the testing wasn't extensive, the result was that ExprTk had the best performance, which is consistent with others who have done these sorts of comparisons (for example, https://github.com/ArashPartow/math-parser-benchmark-project). Moreover, incredibly, when computing a least-squares fit, ExprTk achieved very similar performance to, if not slightly better than, LabVIEW's Formula Node and NI_AALPro.lvlib:Linear Fit.vi, despite the latter two being compiled code. Therefore, considering my requirement of maximising general applicability, ExprTk seems the best choice (note that it did require developing a C++ wrapper to build a DLL that I could call using the LabVIEW Call Library Function Node).
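In case it is useful to anyone, below is a minimal sketch of what such a wrapper can look like, assuming ExprTk's standard symbol-table API; the exported function name and signature are purely illustrative (not my actual code). It simply binds the incoming array to a vector variable named "y", compiles the expression string, and returns the scalar result.

// Minimal sketch of an ExprTk wrapper DLL for the LabVIEW Call Library Function Node.
// The exported name and signature are illustrative only.
// Build example (MSVC): cl /LD /EHsc /O2 exprtk_wrapper.cpp
#include <cstddef>
#include <cstdint>
#include <limits>
#include <string>
#include "exprtk.hpp"   // header-only ExprTk library

#if defined(_WIN32)
  #define EXPORT extern "C" __declspec(dllexport)
#else
  #define EXPORT extern "C"
#endif

// Evaluate expr_str with the array y (length n) bound to the vector variable "y".
// Returns a single scalar, or NaN if the expression fails to compile.
EXPORT double eval_array_expr(const char* expr_str, double* y, int32_t n)
{
    exprtk::symbol_table<double> symbols;
    symbols.add_vector("y", y, static_cast<std::size_t>(n));

    exprtk::expression<double> expression;
    expression.register_symbol_table(symbols);

    exprtk::parser<double> parser;
    if (!parser.compile(std::string(expr_str), expression))
        return std::numeric_limits<double>::quiet_NaN();

    // e.g. expr_str could be "avg(y)" or "sum(y) / y[]"
    return expression.value();
}

In a real streaming application you would, of course, compile the expression once and only re-evaluate it as new data arrive, rather than recompiling on every call as this sketch does.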

Message 7 of 13

I would very much expect that ExprTk implements its standard functions, such as a least-squares fit, by calling a precompiled function too, possibly even the Intel Math Kernel Library, just as the NI Advanced Analysis Library does.

Rolf Kalbermatter
My Blog
Message 8 of 13

That may be so, but unless I'm missing something, I expected that machine code built to perform a single predefined mathematical function would always have the potential to outperform a runtime expression parser. So, whilst I regarded the performance of the former as an upper limit for the latter, realistically I never expected that upper limit to be reached; apparently, though, ExprTk is capable of reaching it (at least in some circumstances).

Message 9 of 13

It's mostly a question of relativity. Executing only one instruction (calling a function) makes very little difference, no matter whether it happens in a runtime scripting environment or in a compiled machine-code program. If the function that is called is comparatively computationally intensive, all you see in your timing measurements is the execution time of that function. If, however, you start to do large loops and other complex work at the scripting level, then that will really tend to dominate your timing measurements.

Rolf Kalbermatter
My Blog
Message 10 of 13