08-18-2008 10:31 AM
I have a group of analog sensors that I'm sampling. Right now, we have a hard-wired app that uses MAX to set up our 32 AI channels and associate custom scales to the appropriate channels. I'm trying to make our app a little more easily configurable by reading a configuration file in the host VI that sets up everything. I was assuming that I'd be able to configure a custom scale per channel using the channel property node. But I can't figure out how to do this AND still have a single task (because they are all still sampled at the same rate and published to a single SV). I'm starting with an array of physical channel name strings (from the file) and feeding them into "Flatten Channel String" and then calling "Create Channel" which outputs a task. How do I get a reference to the individual channels?
Thanks,
Keith.
08-18-2008 11:25 AM - edited 08-18-2008 11:26 AM
Try something like this: use a For Loop to generate all your channels, with a separate Create Channel call for each. You can then define a custom scale for each channel, and they are all still part of the same task.
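Since I can't paste a block diagram inline, here is roughly the same idea sketched in the text-based nidaqmx Python API (the LabVIEW DAQmx palette maps onto the same underlying calls). The channel names and scale names are just placeholders for whatever your config file provides:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, VoltageUnits

# Placeholder configuration that would come from the config file / MAX.
channels = ["cDAQ1Mod1/ai0", "cDAQ1Mod1/ai1"]
scales   = ["PressureScale", ""]          # empty string = no custom scale

task = nidaqmx.Task("AllSensors")
for chan, scale in zip(channels, scales):
    if scale:
        # Units must be set to "From Custom Scale" for the scale name to take effect,
        # and min/max are then in scaled units.
        task.ai_channels.add_ai_voltage_chan(
            chan, min_val=-10.0, max_val=10.0,
            units=VoltageUnits.FROM_CUSTOM_SCALE, custom_scale_name=scale)
    else:
        task.ai_channels.add_ai_voltage_chan(chan, min_val=-10.0, max_val=10.0)

# One sample clock for the whole task, so every channel shares the same rate.
task.timing.cfg_samp_clk_timing(1000.0, sample_mode=AcquisitionType.CONTINUOUS)
```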
08-18-2008 11:28 AM - edited 08-18-2008 11:29 AM
Try something like the attached picture; it should allow you to generate/assign scaling on a per-channel basis.
08-18-2008 12:29 PM
Aha! I hadn't realized I could call create channel multiple times for a task. Thanks!
But the problem now is that it's not applying the scale. I want to keep the scales in MAX because they are tied to the serial numbers of the devices, so I'm passing a scale name string like you suggest rather than creating the scale programmatically. I can tell it's using the scale string, because if I give it a bogus string it returns an error. I'm using a simulated device, with one channel on a scale with a slope of 10 and another channel with no scale (empty scale string), but the values I get back for the scaled channel are not an order of magnitude bigger.
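For reference, here is essentially the check I'm doing, written out in text form with the nidaqmx Python API (the device name "SimDev1" and scale name "TestX10" are made up):

```python
import nidaqmx
from nidaqmx.constants import VoltageUnits
from nidaqmx.scale import Scale

# Define an x10 linear scale in code (in my real app the scale lives in MAX).
Scale.create_lin_scale("TestX10", slope=10.0, y_intercept=0.0)

with nidaqmx.Task() as task:
    # With a custom scale, min/max are in *scaled* units (here +/-100 for a +/-10 V input).
    task.ai_channels.add_ai_voltage_chan(
        "SimDev1/ai0", min_val=-100.0, max_val=100.0,
        units=VoltageUnits.FROM_CUSTOM_SCALE, custom_scale_name="TestX10")
    task.ai_channels.add_ai_voltage_chan("SimDev1/ai1")  # unscaled reference channel

    # I'd expect the first value to be ~10x the second, but that's not what I see.
    print(task.read())
```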
08-18-2008 01:02 PM
You could use global channels; these can be selected within a task, and each one can be given its own custom scale.
Ton
08-18-2008 01:12 PM
What the heck? I've tried setting the scale in my VI like nrp showed, but I get the same results: no errors but no scaling either.
Is there any reason custom scales would not work with simulated channels? That would be extremely surprising to me ...
08-18-2008 02:41 PM
OK, I can't get a scale to work in MAX either. What am I doing wrong? I created a new simulated device and a task for analog voltage inputs. I set one of the channels to use my scale and hit the run button (i.e., I'm not even using a VI now). I still see no scale applied!
08-18-2008 04:35 PM
knicewar,
Simulated devices are not really intended to completely mimic a real-world DAQ system; my understanding is that they are there to get you started if you don't have the hardware present. A simulated device has many restrictions compared to a physical DAQ, and from my testing it would appear that custom scaling does not get applied either.
I think simulated devices are mainly there to make sure your task is set up correctly in terms of timing and to catch other typical development errors.
08-18-2008 10:59 PM
<rant>
Waiting to run on the real HW is too late. I can't believe something that claims to be a rapid data acquisition system development tool is so utterly devoid of any significant simulation support! Granted, I'm new to LabVIEW, but I'm appalled. It seems like they had to go out of their way to PREVENT custom scales from working with simulated signals. I mean, on the surface, MAX seems like a nicely layered architecture, so the upper layers (above the raw unscaled data) should have no clue they are working with simulated data. Obviously, things are not as nicely decoupled underneath as they appear ...
Here's what I expected from LabVIEW: I should be able to write a VI that is the simulation of my system. This should plug into the "back end" of a simulated NI-DAQmx device (so the inputs of the VI are the outputs of the device and vice versa). Then I'd be able to validate my app against a fully functional simulation. Further, after running with real-world data, we could compare against the simulation model and make updates, then do what-ifs to see what should happen on the next run.
The fact that I can't do something as simple as this makes me think that LabVIEW is the wrong tool for the job.
</rant>
08-19-2008 06:29 AM
I can understand your frustration. If you have been working under the assumption that LV is a rapid prototyping environment where you can design software in minutes, I'm afraid you are mistaken. LabVIEW is a fully fledged programming language and takes many years to master.
Once you know most of the "secrets" (and this only comes from experience), most things are pretty easy to accomplish in LV.
For example, if you need to simulate data that is similar to what you expect in the real world, forget about using simulated devices in MAX. Rather, take a few minutes and code a VI which generates data in the form you expect, place this VI where you would normally do an AI Read (don't forget to simulate the acquisition delay), and presto: simulated data!
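If it helps, here is that idea sketched out in text form (Python/numpy just for illustration; the signal shape, rate, and channel count are assumptions, not anything specific to your system):

```python
import time
import numpy as np

def simulated_ai_read(n_channels=32, samples_per_chan=100, rate=1000.0):
    """Stand-in for the AI Read: returns n_channels x samples_per_chan of fake data."""
    time.sleep(samples_per_chan / rate)            # mimic the acquisition delay
    t = np.arange(samples_per_chan) / rate
    tone = np.sin(2 * np.pi * 10.0 * t)            # 10 Hz test signal
    noise = 0.05 * np.random.randn(n_channels, samples_per_chan)
    return tone + noise                            # same tone on every channel, plus noise

# Call this wherever the real read would be, then apply your own scaling downstream.
chunk = simulated_ai_read()
```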
For sure, LabVIEW is the wrong tool for some jobs, but in the right hands it is a truly powerful tool that will knock your socks off and can be used to solve virtually any real-world data acquisition (and much more) problem.
May I ask how long you have been using LV for?