LV_Abstract Open Source OOP Library


wiebe@CARYA wrote:

 


@Intaris wrote:
But as a global resource (seeing how LV deals with classes) it can quickly lead to dependency problems and maintenance explosion. Just my 2c.

I don't get the global resource part. Isn't it a by-wire library?

 

Or do you mean like a global dependency (used everywhere)?

 


The inheritance tree itself is globally defined.

 

You cannot implement two sets of children for the datatypes without implicitly allowing them to be used interchangeably outside of the actual context in which you intend to use them. Trying to enforce this based on having a common ancestor is very tricky.

 

If I make a set of SGL and DBL child objects which implement specific functions I need, the only option I have to make a single connector pane is "Numeric". But that then also includes many classes for which I have NOT included the required functionality. In addition, if I have OTHER classes which implement completely different extensions, they also share the same inheritance hierarchy. My "numeric" input, which is meant to only take SGL A and DBL A, now takes, without any complaint from the IDE, I32 B, I16 C, SGL B and so on. I only find this out some time at run time.

My ability to actually limit the available datatypes which are compatible with the VI is rendered impossible by having a globally defined common ancestor. If I were to implement "parallel" inheritance hierarchies, then I could limit the datatypes as I see fit. I could then have an A inheritance tree (with only the classes implemented which are valid in the context of A) and a separate inheritance tree B which does the same for context B.

 

You have "numeric" as a perfectly valid parent for ALL implemented SGL, DBL, CDB and so on classes over the entirety of your project. It is this "global" effect I am referring to.

Message 21 of 26

I suppose you could make a numeric interface, a scalar interface, a float interface, a signed interface, etc.

 

That way you could be selective.
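A minimal C++ sketch of that idea (the interface names here are hypothetical, not classes from the library): each concrete type implements whichever small "aspect" interfaces describe it, and a function can then demand exactly the aspects it needs.

```cpp
// Hypothetical aspect interfaces, loosely analogous to LabVIEW interfaces.
struct Numeric  { virtual ~Numeric() = default; };
struct Scalar   : virtual Numeric { };
struct Signed   : virtual Numeric { };
struct Float    : virtual Numeric { };
struct Unsigned : virtual Numeric { };

// Concrete classes opt in to the aspects that apply to them.
struct I32Class : Scalar, Signed { };
struct DblClass : Scalar, Signed, Float { };
struct U16Class : Scalar, Unsigned { };

// Selective inputs: the compiler enforces which classes are allowed.
void OnlySigned(Signed& s) {}   // accepts I32Class, DblClass; not U16Class
void OnlyFloat(Float& f)   {}   // accepts DblClass only
```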

 

Not sure how that would turn out (usually a sign of an upcoming headache).

Message 22 of 26

This library can be as typed as you want. I did a quick drawing of the relationships below. I did implement polymorphism, but old-school LabVIEW polymorphism, so the error checking here happens at development time.

 

[Image: LV_Tim_1-1710723489489.png — diagram of the class relationships]

 

Message 23 of 26

@LV_Tim wrote:

This library can be as typed as you want. I did a quick drawing of the relationships below. I did implement polymorphism, but old-school LabVIEW polymorphism, so the error checking here happens at development time.


In the current hierarchy you can't, for instance, make a VI that accepts 'signed' + 'numeric' (I8, I16, I32, I64, floats, maybe waveforms), or a VI that allows only 'unsigned': U8, U16, U32, U64, enums. Or x32: I32, U32, enum32.

 

The problem escalates if you add arrays. You might want to allow only strings: string scalars and string arrays.

 

An interface hierarchy would allow you to do this, by making signed integers and floats inherit from "signed.lvclass". But it will be tedious, especially when you add array support. You'd need a class for each array type, or you won't be able to specify its type.
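To show why the array side gets tedious, here is a small hypothetical sketch in the same C++ style (none of these classes exist in the library): each array type needs its own concrete class if its element type is to stay visible to the type system.

```cpp
#include <string>
#include <vector>

// One "string-like" aspect...
struct StringLike { virtual ~StringLike() = default; };

// ...but scalars and arrays still need separate concrete classes,
// and the pattern repeats for every element type you want to support.
struct StringScalar : StringLike { std::string value; };
struct StringArray  : StringLike { std::vector<std::string> value; };

// Accepts string scalars and string arrays, nothing else.
void OnlyStrings(StringLike& s) {}
```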

 

This would allow you to be selective by using compile-time polymorphism (the best kind).

 

You could say C++ templates (for instance) can do this. The comparison isn't really fair, though: C++ (class) templates are compile-time constructs. LabVIEW would be able to do compile-time polymorphism with .vims (and be selective about supported types). But .vims are functions, and there are no malleable classes. I don't see that changing soon.
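For comparison, a minimal C++20 sketch of that kind of compile-time selectivity (purely illustrative, and not something the library itself contains): a constrained template adapts to every supported type and rejects the rest before anything runs, which is roughly what a selective malleable VI does for functions.

```cpp
#include <type_traits>

// Accept only signed arithmetic types; anything else fails to compile.
template <typename T>
    requires (std::is_arithmetic_v<T> && std::is_signed_v<T>)
T Halve(T x) { return x / 2; }

int main() {
    Halve(10);     // int: fine
    Halve(2.5);    // double: fine (floating-point types count as signed)
    // Halve(7u);  // unsigned int: rejected at compile time
}
```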

Message 24 of 26

In the current hierarchy you can't, for instance, make a VI that accepts 'signed' + 'numeric' (I8, I16, I32, I64, floats, maybe waveforms), or a VI that allows only 'unsigned': U8, U16, U32, U64, enums. Or x32: I32, U32, enum32.

 

I personally have not found a reason to filter by signed and unsigned types. The separation between integers and floats is there because logic functions can be applied to integers but not to floats, and this feature is used in my logic classes. Though if sign filtering is needed, a simple fix is to add an 'unsigned' interface to all of the unsigned classes.

 


 

This would allow you to be selective by using compile-time polymorphism (the best kind).

 


I purposely avoided .vims for this library, because of Darren Nattinger's "Ludicrous Ways to Fix Broken LabVIEW Code":

https://www.youtube.com/watch?v=HKcEYkksW_o

Not that I don't use them; they saved a lot of work in the unit testing. But I like how the dev-time polymorphism gives me immediate feedback that what I am doing isn't going to work.

 

Message 25 of 26

I have just added a set of concrete classes that extends the LV_Abstract classes to add programmatic shared variables to the abstraction.

https://www.vipm.io/package/tsa_lib_lvabs_shared_variable/

 

This is also free and open source.

https://github.com/timstreeter/conc_SharedVariable

 

Message 26 of 26