12-21-2009 05:06 PM
Hello. NI/LabVIEW newbie, so apologies right off the bat for what are probably simple questions. I am using LabVIEW 9.0 and an M Series 6229 board. I have set up a program that drives a servo with a PWM signal, leaning heavily on the examples that are out there. The example I selected uses the 6229's counter to drive the PWM signal. Now I am trying to update the code so that the PWM signal is modulated by a sinusoid, with the frequency and amplitude of the function as user inputs. Basically I want the servo output to track this sinusoid.
The problem I am having is coming up with a way to pull the elapsed time from the 6229 counter to use as an input to the driving sinusoid. I set up a loop in LabVIEW with a specified delay to update the sinusoid, but since I am trying to update every 20 ms I think I might be hitting software timing limitations? At least the output of the servo using this timed loop is erratic. So I'm thinking that using the hardware timer is a more robust solution, but I cannot figure out a way to do this. Any ideas/examples?
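For reference, the pulse-width math I have in mind looks roughly like this (a quick Python sketch of the idea, not the actual VI; the 1-2 ms pulse range over a 20 ms frame is the usual hobby-servo convention and is an assumption on my part):

import math

FRAME_S = 0.020    # 20 ms PWM frame (standard 50 Hz servo update rate)
CENTER_S = 0.0015  # 1.5 ms pulse = servo center (assumed)
SPAN_S = 0.0005    # +/- 0.5 ms swing = full travel (assumed)

def pulse_width(t, freq_hz, amplitude):
    # Pulse width in seconds at elapsed time t for a cosine command;
    # amplitude is a fraction of full travel (0..1).
    return CENTER_S + amplitude * SPAN_S * math.cos(2 * math.pi * freq_hz * t)

# The command the servo should get on each 20 ms update:
for i in range(5):
    t = i * FRAME_S   # elapsed time derived from the loop iteration count
    print("t = %4.0f ms -> pulse = %.0f us" % (t * 1000, pulse_width(t, 0.5, 1.0) * 1e6))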
Please let me know if I need to clarify.
Thanks,
Chris
12-22-2009 02:44 PM
I've updated the code so that the timed loop uses the board counter . . . but the output is still unreliable. Ultimately I will be running this through LabVIEW RT, but are my loop times (20 ms) so fast that I would be running into an issue? It doesn't seem like that would be the case. Any ideas? I've attached the code; I'm hoping there is a glaring error.
Also, if you think this post should be moved elsewhere to get more appropriate traffic, please let me know.
Thanks,
Chris
12-22-2009 03:39 PM
Hi Chris,
I'm a little unclear on the digital task in your attached VI. Is the following example relevant to what you are trying to do?
DAQmx : Pulse Train Generation with Changing Pulse Specs (PWM) -- LabVIEW, C#.NET
12-22-2009 04:22 PM
That is kind of what I am trying to do. In the program you attached, it looks like they are modulating the on-board counter itself to alter the frequency and width of the signal. I want to move away from altering the counter itself, since I hope to drive more than one servo from one counter. This is the example I used extensively:
http://zone.ni.com/devzone/cda/epd/p/id/5043
The counter is used as the basis for a sample clock. A 1000-sample array is generated based on the desired pulse width, and the sample clock iterates such that the program steps through each of the 1000 points at whatever frequency is specified (at least that is my rough idea of how the above example works). I am more or less doing the same thing, but instead of having a user update the pulse width, I want it to be updated by a cosine function. This cosine function is a function of time, so I need to be able to pull out an elapsed time while the program is running; thus the timed loop, which is also run off the board's counter.
So, what I am hoping happens in my program is that every 20 ms a new array is generated based on the iteration number multiplied by 20 ms, the user-specified frequency, the user-specified amplitude, and the cosine function. This array is then passed to the digital output task, which is running 1000 times faster so that it can step through each of the samples in the 1000-sample pulse-width array.
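In rough pseudocode terms, each 20 ms iteration should be doing something like the following (a Python sketch of the array math only, not the VI itself; the 1.0-2.0 ms pulse range is my assumption):

import math
import numpy as np

SAMPLES_PER_FRAME = 1000             # DO task runs 1000x faster than the 20 ms loop
FRAME_S = 0.020                      # 20 ms update period
CENTER_S, SPAN_S = 0.0015, 0.0005    # assumed 1.0-2.0 ms servo pulse range

def frame_samples(iteration, freq_hz, amplitude):
    # One 20 ms frame of the PWM line as a boolean array. The pulse width
    # comes from a cosine of elapsed time, where elapsed time is just
    # iteration * 20 ms.
    t = iteration * FRAME_S
    width_s = CENTER_S + amplitude * SPAN_S * math.cos(2 * math.pi * freq_hz * t)
    n_high = int(round(width_s / FRAME_S * SAMPLES_PER_FRAME))  # samples held high
    samples = np.zeros(SAMPLES_PER_FRAME, dtype=bool)
    samples[:n_high] = True
    return samples

# Each loop iteration hands an array like this to the digital output task,
# which clocks through it at 50 kHz (1000 samples per 20 ms frame).
print(frame_samples(0, freq_hz=0.5, amplitude=1.0).sum(), "high samples in frame 0")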
Something is off with the timing, because the output is erratic, but otherwise it does what I want. Any ideas how to solve the problem? Or what I might be missing?
Thanks,
Chris
12-23-2009 04:37 PM
You mentioned that you're eventually going to run this on an RT system. Have you had the chance to test that yet? I've taken a look at your code, and I think the problem you're seeing is likely related to the fact that you're using a timed loop on a machine that is not RT. Since your loop timing is not purely deterministic, there's a solid chance that is what's causing the erratic output. Also, I'm curious what your expected output is. How are you observing your results? Thanks!
12-24-2009 02:32 PM
Haven't tested it yet. I've been running it off a PXI running LabVIEW under Windows, and I found out that in order to install LabVIEW RT I am going to have to reformat to get the FAT32 RT partition in front of the Windows partition . . . ugh. My fault; I should have looked up the install instructions before reverting back to XP from Vista.
It occurred to me that my issues could be related to running the code non-RT, but it didn't seem like my loop times were that fast. My expected output is the servo horn moving through the specified amplitude in a sinusoidal fashion, moving slower through the peaks and faster through the zero crossings (due to the size of the steps prescribed by the cosine function and the time step). I recognize that this might result in some jitter, since at the peaks the servo might be able to complete the step before getting the next position command from the code, but the servo output currently also changes frequency from one period to the next.
I am witnessing my results by watching the servo horn move, but also by looking at the PWM signal on an oscilloscope. I can see the PWM sweep change frequency on the oscilloscope in agreement with the servo rotation from period to period, so it does not seem to be an issue with the servo itself.
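To put numbers on what I mean by slower through the peaks, here is a quick back-of-the-envelope calculation (Python, just the arithmetic; the 0.5 Hz full-amplitude command and the 1.0-2.0 ms pulse range are assumptions for the example):

import math

FRAME_S = 0.020                      # 20 ms between position commands
CENTER_S, SPAN_S = 0.0015, 0.0005    # assumed 1.0-2.0 ms pulse range

def width(t, f=0.5, a=1.0):          # 0.5 Hz, full amplitude, just for illustration
    return CENTER_S + a * SPAN_S * math.cos(2 * math.pi * f * t)

# Change in commanded pulse width between consecutive 20 ms updates,
# near the peak (t = 0) versus near the zero crossing (t = 0.5 s):
peak_step = abs(width(FRAME_S) - width(0.0))
cross_step = abs(width(0.5 + FRAME_S) - width(0.5))
print("step near peak:          %.2f us" % (peak_step * 1e6))
print("step near zero crossing: %.2f us" % (cross_step * 1e6))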
Hopefully I will get RT functioning next week and I will report back with the details. If anyone has any other ideas though I am interested in hearing them.
Happy Holidays, and thanks for the help.
-Chris