LabVIEW vs. Python for Test Automation

I've been using LabVIEW for almost a decade, and I lead a fairly large test automation project that uses the NI Actor Framework and the NI TestStand API. In a relatively short period of time, I was able to put together a user-friendly, aesthetically pleasing GUI with smart UI logic and the capability to create complex test sequences with many concurrent loops interacting with one another.

 

I can't imagine doing what I did as quickly with a text-based language. Granted, I have limited experience with text-based programming languages: undergraduate-level experience with C, Java, and Python. But even if I were very proficient in, say, Python, I feel LabVIEW would be much more efficient for what I'm doing. I've read many comments from text-based-language programmers about how much LabVIEW 'sucks'. Many times, I can tell their LabVIEW skills are simply low and they don't know what they are doing wrong. You see as much as you know.

 

Now I would really like to hear what people who are experienced with both sides think.

 

Let's say a company has multiple teams that need test automation. Only one team hired a LabVIEW programmer, because its manager knew about LabVIEW and had seen what it can do. The other teams got engineers with Python skills and no exposure to LabVIEW. This happens naturally, because there are many more skilled Python programmers than advanced LabVIEW programmers unless management specifically aims to go the LabVIEW way. These teams work independently for years. One day, upper management wants to come up with a common approach to test automation across the teams. You are the lone LabVIEW programmer; the others are all Python programmers. You can see that what you do with LabVIEW and TestStand is more sophisticated and user-friendly, but Python is totally free. How would you make a case? Is there any comprehensive third-party academic paper about these comparisons? I'd love to hear any relevant thoughts or stories.

Message 1 of 20
(15,075 Views)

I work as a full-time test software developer, and in my work I have had to use LabVIEW, Python, C, etc. These are only tools: if the only tool you know is a hammer, then every problem looks like a nail. Usually the people who say things like "eww, LabVIEW is not a real programming language" or "eww, LabVIEW sucks" are the ones who spent a few days with it and had what I call the 'LabVIEW shock'.

Python is great, but I would not use it for ATEs or complex data acquisition systems. C is great, but I usually use it for low-level board code, e.g. on STM32s. LabVIEW is the only platform that lets me develop a whole DAQ / real-time / FPGA project from scratch in a few days or weeks.

You might say, "hey, it costs me 10k for all the licences." True, but you would still pay that as development cost, because it would take much, much longer to do the same with Python or C; if you add up the man-hours, it probably ends up being more expensive (I do not know where you are from or what the hourly wages are). Every language is good at something, but I find LabVIEW to be the best when it comes to complex hardware integration systems.
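
For concreteness, simple SCPI bench-top control from Python typically looks something like the minimal PyVISA sketch below (the VISA address and the commands are hypothetical examples); this kind of scripting is easy in Python, which is a different problem from the real-time/FPGA integration described above.

# Minimal sketch of SCPI instrument control from Python using PyVISA.
# The VISA resource address and the SCPI commands are hypothetical examples.
import pyvisa

rm = pyvisa.ResourceManager()
scope = rm.open_resource("TCPIP0::192.168.1.50::INSTR")  # hypothetical address
scope.timeout = 5000  # ms

print(scope.query("*IDN?"))              # ask the instrument to identify itself
print(scope.query(":MEAS:VRMS? CHAN1"))  # hypothetical measurement query

scope.close()
rm.close()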

Message 2 of 20
(15,000 Views)

@Chickentree wrote:

This naturally happens because there are many more skilled Python programmers than advanced LabVIEW programmers unless the management specifically aims to go the LabVIEW way.

I've always wondered about that... Are there more skilled Python programmers? Or can anyone who made a snake game simply call him/herself an expert?

 

I really don't know... LabVIEW has a skill validation mechanism (certification); does Python?

 

I'd say most LabVIEW programmers are technically inclined. Python programmers are probably more diverse, Python being more 'general purpose'.

 


@Chickentree wrote:

You are the lone LabVIEW programmer. The others are all Python programmers. You can see what you do with LabVIEW and TestStand is more complex and user-friendly but Python is totally free. How would you make a case?


The proof of the pudding is in the eating: how well do both parties do with a real-life challenge?

 

Also consider other aspects. Development time to create something is one thing; how much time would it take to get a new team member on board? You could actually hire an external CLA-level (or higher) LabVIEW programmer and a "skilled" Python programmer and ask them both to do the same task.

 

I don't think you'll get an objective answer here either. We can all throw success stories at you, but that wouldn't be evidence.

 

Of course moving to a company that appreciates LabVIEW programmers is an option (but not an answer to the question).

Message 3 of 20
(14,980 Views)

Using AF and TestStand certainly is non-trivial.  Has that effort, in terms of quick development and level of capability, been appreciated?  Or do they stop at the "LabVIEW sucks" thing?

 

"Python is totally free" refers to the license.  That is a really small part of the life cycle of software development.  Before the community edition, LabVIEW carried a cost to entry which allowed for tools like Python to rise up.  I think Python is mostly going after Matlab users.

 

Generally speaking, Python and LabVIEW users tend not to have software engineering backgrounds.  Are the teams using source code control, task/issue tracking, and requirements management?  If not, the tool is not the problem.

 

There is a group of users you may run into who blame the tool.  Ask them to tell you stories about projects they worked on that failed.  If they have no such stories, that means they either lack experience or for some reason are not sharing those experiences.  If they do have such a story, look for patterns: do they blame (a) others, (b) the tool, (c) management, or (d) themselves?

 

Another important topic is technical debt.  In several cases I have seen teams become enchanted by a new tool (could be LabVIEW, Python, etc.) when the proverbial 40% of functionality is done in the form of a prototype in, say, 2-3 weeks.  They get excited and switch over to that tool, and the remaining 60% takes 4-5 months.  The reason is that the initial prototype had a lot of technical debt.  It was demo-able, which is good, but the debt was not recognized, and the development time afterwards was spent adding new features while also paying down debt.

 

For me, management and engineering must understand the concept of technical debt.  It is a reality, and I am not saying you should never carry technical debt, but you should know when you have it.  It is like saying "look, I own 5 houses" without looking at the balance sheet to realize you owe a ton of money on those houses.  Code is similar.

 

My first concern with Python would be hardware interfacing.  More generally, for a decision like this you should build a table of objective criteria, so you can make a documented decision based on facts that everyone agrees on.
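
As a toy illustration of such a criteria table (every criterion, weight, and score below is a hypothetical placeholder the teams would agree on together), here is a short Python sketch that computes a weighted score per option:

# Toy weighted decision matrix; all criteria, weights (1-5) and scores (1-10)
# are hypothetical placeholders, not a recommendation.
criteria = {
    "hardware interfacing":  5,
    "development speed":     4,
    "hiring pool":           3,
    "license cost":          2,
    "long-term maintenance": 4,
}

scores = {  # one score per criterion, in the same order as the dict above
    "LabVIEW + TestStand": [9, 8, 4, 3, 7],
    "Python":              [6, 6, 9, 10, 7],
}

for option, vals in scores.items():
    total = sum(w * s for w, s in zip(criteria.values(), vals))
    print(f"{option}: weighted score {total}")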

 

I do not know of any research papers comparing text-based and graphical coding.  Most of what I know is in the form of stories where using LabVIEW took half to a third of the time of using C (a text-based approach).

 

In your case, since there are a bunch of Python developers, any analysis may be skewed towards keeping them busy.  I think that is a risk, because there will be a temptation to say "let's get them going on things we need to do anyway," and then the big picture is lost.

 

I will leave you with the NI Center of Excellence page: https://learn.ni.com/center-of-excellence. It may have some info on best practices for team development.


Certified LabVIEW Architect, Certified Professional Instructor
ALE Consultants

Introduction to LabVIEW FPGA for RF, Radar, and Electronic Warfare Applications
Message 4 of 20
(14,958 Views)

@Terry_ALE wrote:

My first concern with Python would be hardware interfacing.


I agree as regards Python.

Python is just hyped now because AI / IoT / Big Data are hyped, and Python has a lot of libraries and algorithms already written, so the typical kid can do something in a few days.

 

You are nonetheless totally wrong in general, because in .NET you have Measurement Studio, which is equally easy to use, and even better than LabVIEW, because you can hire a standard .NET developer and he would just write something like:

 

// illustrative pseudocode for a Measurement Studio-style acquisition
mydevice.Open();
var data = mydevice.AcquireDaqMx();
mydevice.Close();

 

This is totally fine and fast.

Message 5 of 20
(14,161 Views)

I agree, you can find more people who know .NET than LabVIEW.  Do you find that .NET developers understand the equipment being controlled?

 


Certified LabVIEW Architect, Certified Professional Instructor
ALE Consultants

Introduction to LabVIEW FPGA for RF, Radar, and Electronic Warfare Applications
Message 6 of 20
(14,136 Views)

Chiming in on this discussion very late, but I felt I should as it's very relevant to my situation and still relevant years after this post was created.

 

Currently, I'm the only test engineer at the company, leading a project to develop an end-to-end test solution that solves several manufacturing problems, both small and large in scope.

 

There are several other developers at the company who represent the Python side of the argument. The test engineer I worked with previously started here in their first test role, so they were split between the LabVIEW and Python sides of the argument. They've since left the company, leaving me in charge of the entire scope of manufacturing test development and support.

 

Now, the level of skill I've seen here, which represents the totality of the LabVIEW implementations prior to my arrival, was pretty low. Essentially, all of the LabVIEW programs were written with race conditions baked into the framework. They used indicators and property nodes as memory for everything and had no concept of implementing functional global variables or shift registers effectively. The resulting impression this gave everyone without LabVIEW experience was "all LabVIEW programs suck," because they all came with baked-in delays to prevent race conditions, and the GUIs ran extremely slowly due to the enormous number of indicators.

 

Now, on the Python side, because there are several individuals across departments with skill levels ranging from high to low, the conversation becomes skewed. The only way I can demonstrate the value provided by LabVIEW is through practical demonstration and by stubbornly refusing to use Python when a LabVIEW implementation would simply be easier and quicker. No, I don't have the time to write a white paper or to look up relevant research papers to back up my case. I have work to do here.

 

Since I started at the company (less than 2 years ago), I developed my own implementation of an Actor Framework which utilizes QDSM tools to integrate several test instruments, I developed a test scripting language which works effectively in place of TestStand (at least at a basic level), and I've created a standardized, highly modularized hardware testing platform from scratch. There's no way any text-based language would've enabled me to handle all of that in the same timeframe.

 

In my experience, Python integrates into LabVIEW via the Python Node very well. There are some situations where a tool is needed to create an intuitive connecting layer between LabVIEW (or some other framework) and both the test and development departments of a company. This is the only situation I've found where Python was a requirement. A debug interface via RS-232 can be too low-level to be efficiently implemented in LabVIEW. In a case like this, it is better to create a layer which collects the low-level data and places it into a JSON field. Python is a good option for this, as it produces a tool which easily connects to LabVIEW's JKI JSON VIs as well as to any debug interface tool that doesn't talk to test equipment at all. A rough sketch of such a bridge follows below.
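
As a rough illustration of that connecting layer (the port name, baud rate, and "key=value" line format below are hypothetical assumptions), here is a minimal Python sketch that reads debug lines over RS-232 with pyserial and re-emits them as JSON that a LabVIEW-side JSON VI could parse:

# Minimal sketch of an RS-232-to-JSON bridge; port, baud rate, and the
# "key=value" debug line format are hypothetical.
import json
import serial  # pyserial

with serial.Serial("COM5", 115200, timeout=1) as port:
    for _ in range(10):  # grab a handful of debug lines
        raw = port.readline().decode("ascii", errors="replace").strip()
        if not raw:
            continue
        # assume lines look like "temp=42.1,vbat=3.71"; split into key/value fields
        fields = dict(pair.split("=", 1) for pair in raw.split(",") if "=" in pair)
        print(json.dumps({"raw": raw, "fields": fields}))  # one JSON object per line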

 

The concept of technical debt brought up previously is 100% accurate and especially important for test development. The manufacturing test floor is, in some ways, like going to the moon: once a test tool is deployed to the line, the cost of simply getting access to that tool to perform any changes or fixes retroactively is much greater over time than if you had applied some foresight during its early development.

 

Yes, you could build a test environment on the cheap in a short time, but I promise you will pay an immeasurable debt over the years as you struggle to keep those tools in accordance with your test requirements. Manufacturing is very much a dynamic environment, and an incredibly dynamic environment DEMANDS tooling which can react quickly and efficiently while also being highly precise and accurate. Good luck getting one test engineer to meet those requirements if they are under the impression that LabVIEW "sucks".

Message 7 of 20
(7,804 Views)

I have constructed three very sophisticated test automation systems in LabVIEW, and previously one in C and one in C++. The biggest problem is that these are all proprietary to the companies that paid for them, so for each new effort I have to start at the bottom and build everything from scratch. The result is immense duplication of effort that doesn't contribute value to me or to the clients. This time, I'm writing a new system on my own that I can use as an open-source platform for client development. Maybe I'll post it on GitHub. But this is all contingent on completing development before the money runs out.

 

Prior to LabVIEW 8.6 I favored C++, but after NI added OOP, VI Server, and recursion, LabVIEW became viable for my purposes. Easier implementation of concurrency and multi-core support are among the big advantages that LabVIEW has over Python. My new system will leverage these heavily for strong capabilities and high performance.
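
For context on that comparison: parallel I/O-bound work (e.g., querying several instruments at once) is still quite doable in Python with a thread pool, though true multi-core use for CPU-bound work generally requires multiprocessing because of the GIL. A minimal sketch, assuming a hypothetical read_instrument() helper and made-up addresses:

# Sketch of parallel, I/O-bound instrument work in Python using a thread pool.
from concurrent.futures import ThreadPoolExecutor
import time

def read_instrument(address: str) -> str:
    # stand-in for a slow, I/O-bound instrument query (hypothetical helper)
    time.sleep(0.5)
    return f"{address}: reading ok"

addresses = ["GPIB0::10::INSTR", "GPIB0::12::INSTR", "GPIB0::14::INSTR"]

# Threads overlap the waiting; CPU-bound analysis would need multiprocessing.
with ThreadPoolExecutor(max_workers=len(addresses)) as pool:
    for result in pool.map(read_instrument, addresses):
        print(result)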

Message 8 of 20
(7,779 Views)

@Chickentree wrote:

How would you make a case? Is there any comprehensive third-party academic paper about these comparisons?


Since this thread was resurrected recently, maybe we can revisit this question. I think one of the more compelling arguments for using LabVIEW has always been speed of development. The LabVIEW ecosystem was made to create T&M software, so by design it should enable rapid development of T&M systems. For many years LabVIEW has been able to hold the rapid-development title, but will it last forever?

 

In very recent times, AI code generation has gotten to the point where it is actually useful, and it's getting better every day. It's hard to know the future, but I know that today I can use GitHub Copilot to write Python code at least twice as fast as I was writing code last year. The AI-suggested code is generally as good as or better than code I would write without the AI help. So a year from now, where does that leave LabVIEW?

 

I think that if LabVIEW does not get AI code generation, it will soon (if not already) fall behind text-based languages in speed of development.

 

______________________________________________________________
Have a pleasant day and be sure to learn Python for success and prosperity.
Message 9 of 20
(7,758 Views)

@Jay14159265 wrote:

I think that if LabVIEW does not get AI code generation, it will soon (if not already) fall behind text-based languages in speed of development.


Ever heard of fuzzy logic? It was the BIG thing some 20 to 25 years ago, and it seemed you could not sell a coffee machine or washing machine anymore without a "uses fuzzy logic for a better cleaning result (or better-tasting coffee)" label on it. There was also frequently the argument that it would save energy and detergent, since it used only just the right amount of energy and time.

 

Fuzzy logic was in principle just a bit of an adaptive algorithm, and a pretty crude one at that: more adaptable than pure on/off control, but everything of course depended on the sensors, if they were even present. Its main advantage was that you could "tune" it with some simple trial and error, while for PID controllers you needed a deeper understanding of the entire control loop and its parameters to make it work reliably.

And quite frequently that fuzzy logic was just a little control loop somewhere in the firmware, there so the maker could claim the device used this great new superpower technology. Sometimes it was just a label put on the device anyway, freely according to the motto "paper is very patient": who in the world would ever go to the effort of taking a device apart to make sure it actually contained this super tech?

 

I'm not saying that AI is without any merit. But much of it is hype anyway, and self-supporting hype at that. Someone discovered this new invisible cloth, everybody wants to jump on the bandwagon to gain a bit of profit too, and nobody so far has dared to say: "But he is not wearing any clothes!"

 

Almost nobody seems to worry about the fact that the hardware on which ChatGPT and similar systems run is heading into the next energy fiasco, after many of the blockchain technologies already set a new "standard" in energy consumption. Isn't it crazy that a Bitcoin transaction consumes around 707 kWh? That's a quarter of a year of energy consumption for an average household here in the Netherlands! ChatGPT and similar technologies seem to be quite large energy consumers too. It may be less than a tenth of a Wh per query, but if you add that up over the millions of queries that come in, they must already be using an electric power plant just for themselves.

Rolf Kalbermatter
My Blog
Message 10 of 20
(7,720 Views)