10-18-2023 04:27 AM
Hello,
I'm trying to set up a unit test with LabVIEW's "Unit Test Framework". I have a typedef cluster as input, and one of its parameters is a double-precision float value called "time". The problem is that LabVIEW doesn't take the value I set in the test case, but uses a default value of 2 instead. How can I make LabVIEW use the value defined in the test case?
Greetings
ch33rs
10-18-2023 05:32 AM - edited 10-18-2023 05:33 AM
It's quite suspicious to me that the parameter is called "InputTime" in the test and "time" in the VI.
Other than that, it's hard to say without being able to interact with the project.
10-18-2023 05:55 AM
"InputTime" is just the name of an indicator I created to get the value for the screenshot. It is not influecing the test case. Is it possible that a set default value in the input typedef can result in such a behaviour? Is the test framework not able to overwrite the default value?