LabVIEW Idea Exchange


The LabVIEW Robotics Module consists of a variety of sensor and actuator drivers, motion algorithms (such as kinematics), world map creation and search algorithms, a world simulator, etc.

 

It hasn't been updated since 2019 and is 32-bit only.

 

I would like to see it made open source.

I believe some or all of the sensor drivers are already available on ni.com/idnet.

There are several other VI-based components and examples that are standalone and could be easily released independently.

I realize that the simulator might have some 3rd party constraints for releasing as open source, but I'd love to see it released if possible.

The recently introduced Raspberry Pi is a 32-bit ARM-based single-board computer that is very popular. It would be great if we could program it in LabVIEW. This product could leverage the already available LabVIEW Embedded for ARM and the LabVIEW Microcontroller SDK (or other methods of getting LabVIEW to run on it).

 

The Raspberry Pi is a $35 (with Ethernet) credit-card-sized computer that is open hardware. The ARM chip is a Broadcom ARM11 running at 700 MHz, resulting in 875 MIPS of performance. By way of comparison, the current LabVIEW Embedded for ARM Tier 1 (out-of-the-box experience) boards have only 60 MIPS of processing power. So, about 15 times the processing power!

 

Wouldn't it be great to program the Raspberry Pi in LabVIEW?

Problem

Many times, the bulk of LabVIEW development happens on computers that will never interface with hardware. A dozen engineers may be collaborating on code that will ultimately run on a dedicated, connected machine somewhere. Yet, as things currently stand, I have to install more than I need on my development machine just to get access to API VIs. If I am working on my laptop on an application with DAQ, RF, spectrum analyzer, etc. components, I have to choose between downloading and installing all of that, or dealing with missing VIs and broken arrows. This seems needless, since my particular machine will never actually interface with the hardware.

 

Idea

I would like to have the option to install only the LabVIEW VIs and ignore the driver itself. In many, if not most cases, the LabVIEW API could be independent of driver version. It could install very quickly, since it would just be a set of essentially no-op VIs. I don't care that the VIs would do nothing. They would just be placeholders for my development purposes. This would allow me to have full API access to develop my code without having to carry around large driver installations that I will never actually use.
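For illustration, here is a rough sketch (in Python, since the concept is language-agnostic) of the kind of "API-only" placeholder layer being proposed. Every name in it is invented for illustration; nothing here is an NI API.

# Hypothetical "API-only" stub layer: it mirrors a driver's public calls but
# performs no hardware I/O, so application code can be written and compiled
# without the driver installed. All names below are invented for illustration.

def open_session(resource_name: str) -> dict:
    """Placeholder: returns a dummy handle instead of opening real hardware."""
    return {"resource": resource_name, "stub": True}

def read_samples(session: dict, n_samples: int) -> list:
    """Placeholder: returns zeros so downstream code sees the right types."""
    return [0.0] * n_samples

On the deployment machine, the real driver would take the place of these stubs without any change to the calling code.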

I have used LabVIEW for a long time and am an avid user. One issue I have been hitting lately is that the "LabVIEW everywhere" slogan never really panned out; it has become LabVIEW everywhere NI allows it to be. I am getting jealous of the Arduino, the Raspberry Pi, and the hundreds of PICs and ARMs not available to me (yes, I have the Pro license, but not Embedded). I wish LabVIEW Pro opened up the toolchain and started porting to many other platforms by default. I am seeing LabVIEW lose out on jobs where it should be much more competitive, like the embedded market.

 

Essentially, I am looking to see the LabVIEW development environment easily work with toolchains for the most popular processors, and also open up a simple standard for adding targets to projects.

 

Wouldn't it be nice to program a $25 Arduino directly from LabVIEW (no, this is NOT what the toolkit is doing)? Add an Arduino target file (which maps the I/O memory to variables), throw down a loop, a Boolean shift register, a wait, and a digital line variable, download to the micro, and the blink-LED example is done. Really open up the doors for LabVIEW everywhere.

 

 

 

The Arduino Due is a 32-bit ARM-based microcontroller board that is destined to be very popular. It would be great if we could program it in LabVIEW. This product could leverage the already available LabVIEW Embedded for ARM and the LabVIEW Microcontroller SDK.

 

The Arduino Due is currently in developer trials and is due out later this year. It is expected to be about $50 and is open hardware. The ARM chip is an Atmel SAM3X8E ARM Cortex M3 running at 84 MHz resulting in 100 MIPS of performance. By way of comparison, the current LabVIEW Embedded for ARM Tier 1 (out-of-the-box experience) boards have only 60 MIPS of processing power.

 

The Arduino brand has an enormous following and Google has selected the Arduino Due for their recently introduced (28 June 2012) Accessory Development Kit for Android mobile phones and tablets (the ADK2012).

 

(By the way, the currently available LabVIEW Arduino toolkit does not target the Arduino (and couldn't, since the Arduino Uno uses only an 8-bit microcontroller). Instead, there is fixed C code running on the Arduino to transfer peripheral information to the serial port and back. That is, none of the LabVIEW target code executes on the Arduino. This idea is for LabVIEW code developed on a desktop to be transferred to and executed on the target Arduino Due.)

 

Wouldn't it be great to program the Arduino Due in LabVIEW?

I distribute a lot of code, and sometimes it's difficult to tell my users what they need to install in order to run that code. It would be nice if I (or a user) could run a built-in LabVIEW utility that tells me what a given VI needs to run.

 

For example, do I need DAQmx, Mathscript, Robotics?

This idea was submitted by a user who requested I post this for them.

 

As of now, the VISA resource controls only allow you to select resource names without their full Windows description. You can select individual COM ports (COM3, COM5, etc.), or pick from a list of alias names if you've defined aliases for your COM ports. But it might be nice to give the user a configurable option that provides the additional descriptive information you can find in Windows Device Manager. This would allow novice users to select the desired COM port based on the actual physical hardware needed for the application. Again, I'm pretty sure you can work around this by reviewing the different COM ports in Measurement & Automation Explorer, or even by creating your own aliases to surface the additional information. But if I'm creating an executable to be used on different systems by novice users, I may not want them to have to go into MAX to properly identify their desired port.

 

So, instead of asking the user to select a COM port from a list of items looking like this...

travisferguson_0-1641925866067.png

 

Maybe give an option in a property page for the VISA Resource Control that might look like this (this is a mock-up)...

 

travisferguson_1-1641925926951.png

so that an operator can pick from a more descriptive list like what you see in the Windows Device Manager...

travisferguson_2-1641925994204.png
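For reference, here is a minimal sketch of the kind of descriptive listing the mock-up shows, using the third-party pyserial package in Python (an assumption used for illustration; this is not part of NI-VISA):

# List serial ports with their Windows Device Manager friendly names.
from serial.tools import list_ports

for port in list_ports.comports():
    # e.g. "COM3 - USB Serial Port (COM3)"
    print(f"{port.device} - {port.description}")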

 

Thank you,

 

 

 

 

The BeagleBoard xM is a 32-bit ARM-based single-board computer that is very popular. It would be great if we could program it in LabVIEW. This product could leverage the already available LabVIEW Embedded for ARM and the LabVIEW Microcontroller SDK (or other methods of getting LabVIEW to run on it).

 

The BeagleBoard xM is $149 and is open hardware. The BeagleBoard xM uses an ARM Cortex A8 running at 1,000 MHz resulting in 2,000 MIPS of performance. By way of comparison, the current LabVIEW Embedded for ARM Tier 1 (out-of-the-box experience) boards have only 60 MIPS of processing power. So, about 33 times the processing power!

 

Wouldn't it be great to program the BeagleBoard xM in LabVIEW?

This is something a few power users have asked me about. There's no Instrument Driver or VIPM Idea Exchange, so I thought I would post it here.


What if VIPM could manage Instrument Drivers from IDNet?
There are a few key benefits this would offer us...

  • download IDNet drivers directly from VIPM 
  • track which version of a driver you are using for different projects and revert when necessary 
  • wrap up ID dependencies in a VIPC file for use at a customer site
Install Other Version.png
Get Info.png 

With the advent of the IoT and the growing need to synchronize automation applications and other TSN (Time Sensitive Networking) use cases, UTC (Coordinated Universal Time) is becoming more and more problematic. Currently, there are 37 seconds not accounted for in the timestamp, which is stored in UTC. The I64 portion of the timestamp datatype, which holds the number of seconds elapsed since 00:00:00 January 1, 1904 UTC, simply ignores leap seconds. This is consistent with most modern programming languages and is not a flaw of LabVIEW per se, but it isn't quite good enough for where the future is heading. In fact, there is a joint IERS/IEEE working group on TSN.

 

Enter TAI, or International Atomic Time: TAI has the advantage of being contiguous and is based on the SI second, making it ideal for industrial automation applications. Unfortunately, a LabVIEW timestamp cannot be formatted in TAI. Entering a time of 23:59:60 31 Dec 2016, a real second that did occur, is not allowed. IERS Bulletin C is published to give the current UTC-TAI offset, but it takes extensive code to implement the lookup, and the result just won't display properly in a %<>T or %^<>T (local absolute time container and UTC absolute time container). We need a %#<>T TAI time container format specifier. (Or soon will!)
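To make the requested behaviour concrete, here is a minimal Python sketch of applying the Bulletin C offset (it assumes the current TAI-UTC value of 37 s; a real implementation would look the offset up rather than hard-code it):

# Shift a UTC time onto the TAI scale using the published TAI-UTC offset.
# Note: this cannot represent the leap second 23:59:60 itself, which is
# exactly the limitation the idea wants a %#<>T specifier to remove.
from datetime import datetime, timedelta, timezone

TAI_MINUS_UTC = 37  # seconds, valid since 2017-01-01 until the next leap second

def utc_to_tai(utc_dt: datetime) -> datetime:
    return utc_dt + timedelta(seconds=TAI_MINUS_UTC)

print(utc_to_tai(datetime(2017, 1, 1, tzinfo=timezone.utc)))
# -> 2017-01-01 00:00:37+00:00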

Taking ideas like the one seen here, I got this idea:

 

 

When working with "Path Constant", which works with a path too long, we observe look like this:

path2editado.PNG

It would be much nicer if we could double-click the Path Constant to see something like this:

path3 editado.PNG

The idea is this.

path idea labview editado.PNG

Given the increasing number of questions about this communication protocol, it is time to rewrite the MODBUS library. I also suggest adding it to the NI device drivers installer.

 

This could be the place to list the expected modifications. Some comments and bugs are already listed on the page linked above.
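As a point of reference for what such a library has to build under the hood, here is a bare-bones Python sketch of a MODBUS/TCP "Read Holding Registers" (function code 0x03) request; robust reception, error handling, and exception responses are omitted, and the host address in the comment is a placeholder:

import socket
import struct

def read_holding_registers(host, start_addr, count, unit_id=1, port=502):
    # MBAP header: transaction id, protocol id (0), remaining byte count, unit id
    request = struct.pack(">HHHB", 1, 0, 6, unit_id)
    # PDU: function code 0x03, starting address, quantity of registers
    request += struct.pack(">BHH", 0x03, start_addr, count)
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(request)
        header = sock.recv(9)        # MBAP (7 bytes) + function code + byte count
        data = sock.recv(header[8])  # register values, 2 bytes each
        return struct.unpack(">" + "H" * count, data)

# Example with a placeholder address:
# print(read_holding_registers("192.168.1.10", start_addr=0, count=4))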

In the drivers (DAQmx, NI-Scope/FGEN, ...):

If a value (usually a DBL) enters a config VI, it could be coerced by the driver. That VI should directly output the actual value.

Most often seen in the forum: a user configures an invalid DAQmx Timing sample clock and wonders why the waveform data is not what they expected.

Yes, RTFM helps, and you can read the properties back after configuration, but it would be easy to add an output to the config VI with the actual (possibly coerced) value, making the user aware of that 'feature' (instead of popping an error 😉).

This should cover not only sample rate, but also other values like FGEN standard function parameters (frequency, phase), actual range, ...
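For comparison, the Python nidaqmx API already exposes this read-back-after-configuration pattern; a minimal sketch ("Dev1/ai0" is a placeholder channel name):

# Ask for an arbitrary rate, then read back the rate the hardware actually coerced it to.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(rate=1_234_567.0,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=1000)
    print("requested: 1234567.0, actual:", task.timing.samp_clk_rate)

The idea is simply to surface that actual value directly on the config VI instead of requiring a separate property read.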

How is NI-MAX still so bad after all these years? It goes completely unresponsive and crashes at the slightest provocation. This feels like it should be NI's bread-and-butter.

 

Every LabVIEW team ends up recreating so much of the functionality that NI-MAX is supposed to offer out of the box. Maybe with the discontinuation of NXG, some resources should be allocated to making this product usable.

 

TurboPhil_1-1629324790878.png

 

 

KB articles like this and this probably shouldn't need to exist.

There are a plethora of timestamp formats used by various operating systems and applications. The IEEE 1588 precision time protocol itself lists several. Windows is different from various flavors of Linux, and Excel is very different from almost all of them. Then there are the details of daylight saving time, leap years, etc. LabVIEW contains all the tools to convert from one of these formats to another, but getting it right can be difficult. I propose a simple primitive to do this conversion. It would need to be polymorphic to handle the different data types that timestamps can take. This should only handle numeric data types, such as the numeric Excel timestamp (a double) or a Linux timestamp (an integer). Text-based timestamps are already handled fairly well. Inputs would be timestamp, input format type, output format type, and error. Outputs would be resultant format and error.
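To show the kind of bookkeeping such a primitive would hide, here is a minimal Python sketch converting two common numeric formats to the LabVIEW epoch (seconds since 1904-01-01 00:00:00 UTC), ignoring leap seconds:

from datetime import datetime, timezone

LV_EPOCH = datetime(1904, 1, 1, tzinfo=timezone.utc)
UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
EXCEL_EPOCH = datetime(1899, 12, 30, tzinfo=timezone.utc)  # Excel serial day 0 for post-1900 dates

def unix_to_labview(unix_seconds):
    """Linux/Unix epoch seconds -> LabVIEW epoch seconds."""
    return unix_seconds + (UNIX_EPOCH - LV_EPOCH).total_seconds()

def excel_to_labview(excel_serial_days):
    """Excel serial date (fractional days, usually local time) -> LabVIEW epoch seconds."""
    return (EXCEL_EPOCH - LV_EPOCH).total_seconds() + excel_serial_days * 86400.0

print(unix_to_labview(0))       # 2082844800.0  (the 1904/1970 epoch offset)
print(excel_to_labview(25569))  # 2082844800.0  (Excel serial 25569 is 1970-01-01)

Daylight saving time and time-zone handling are exactly the parts this sketch glosses over and that a built-in primitive would need to get right.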

Hi!

Since National Instruments offers support for programming ARM microcontrollers, I think it would be great to start supporting programming of the very popular, recently released BeagleBone development board made by Texas Instruments. You can find more information about this device here: http://beagleboard.org/bone . Please kudo it 🙂

NXG needs an Idea Exchange.  The feedback button is a lame excuse for a replacement.  Why?

 

  • I can't tell if my idea has been suggested before.  (And maybe someone else's suggestion is BETTER and I want to sign onto it, instead.)
  • NI has to slog through bunches of similar feedback submissions to determine whether or not they are the same thing.
  • Many ideas start out as unfocused concepts that are honed razor sharp by the community.
  • This is an open loop feedback system.

Let's make an Idea Exchange for NXG!

The VISA test panel is a very valuable tool for troubleshooting instrument connectivity issues.

 

This used to be included with the VISA runtime, or at least with any installer that also included the VISA runtime.

 

Now I have to separately download and install the FULL VISA just to get this valuable tool. 

 

That makes installing a LabVIEW executable a multistep process as now I have to run two different installers. 

 

NI-MAX and the VISA test panel should ALWAYS BE included in any installer that includes the VISA runtime.

Currently LabVIEW only has support for Mandriva, RedHat, and SUSE Linux. What's even worse, only 32-bit versions of those are supported. Today, 64-bit Linux installations are on a huge rise, and Ubuntu is getting more and more popular. LabVIEW Linux support should be expanded to include Ubuntu, and 64-bit versions are needed.

 

cheers,

Pekko

 

I think it is very difficult to make a UI that runs on Windows and interacts with targets. Here are two suggestions to improve this:

 

1. We currently can't use the \c\ format style in a file path control on Windows. It would be nice to allow the user to specify the OS syntax to use instead of assuming it should always be the local syntax.

2. The icing on the cake would be to have the File Path Control support a property node for an IP Address so when the user clicks on the browse button, it automatically browses the target (this is already an idea mentioned in the link below) and uses the syntax of the target. This becomes especially useful as we start to have targets that may have an alternative way of browsing files besides FTP. It would be a pain to figure out which SW is installed on the target and use the correct method to enumerate files/folders.

 

http://forums.ni.com/t5/LabVIEW-Idea-Exchange/Path-control-of-VI-under-real-time-target-should-browse-target-s/idi-p/1776212

 

These two features could be implemented by having an enum property node for the File Path Control called Syntax, which includes options like: Local, various OSes (Windows, Linux, Mac, etc.), or IP Address. If IP Address is specified, another property node called IP Address would be used to determine which target's OS to use (if it's not specified or invalid, we default to Local).
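For reference, text-based languages already separate path syntax from the local OS; here is a short Python sketch of the concept (the paths are made up for illustration):

# Pure paths build and display a path in a chosen OS syntax without touching
# the local file system, which is the behaviour the Syntax property would expose.
from pathlib import PurePosixPath, PureWindowsPath

target_path = PurePosixPath("/home/lvuser/data/run1.tdms")   # e.g. a Linux RT target
host_path = PureWindowsPath(r"C:\data\run1.tdms")            # local Windows syntax

print(target_path.name)        # run1.tdms
print(host_path.as_posix())    # C:/data/run1.tdms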