10-13-2009 11:24 AM
Hello,
I'm using LabVIEW 8.6.
I'm trying to write code that will let me record an analog output channel of a DAQ 6259.
My problem is that I need to record the channel intermittently (for example, record for 30 ms, stop for 200 ms, record for 10 ms, stop for 50 ms, and so on), and I can't have any delay between my record/stop command and the execution of that command (at least not from the point of view of the recorded data).
I tried appending the data to an array inside a while loop, with the while loop inside a case structure that selects whether it is record time or pause time.
The problem is that the loop timing is not precise enough when I do that, and the data is not sequential.
Does anyone have an idea how I can do this correctly?
10-13-2009 11:38 AM
What determines when the start and stop commands occur? How does the software get this information? The times you suggest are much too fast for a human user to generate. Are they predefined or on an analog or digital input channel?
In any case I suggest that you set up to acquire the data continuously and then extract the sections you want to record. A Producer/Consumer architecture might be appropriate.
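Something like this sketch of the idea, written in Python since I can't paste a block diagram here (the sample rate, chunk size, and window values are placeholder numbers, and the simulated read stands in for a continuous DAQmx read):

import queue
import threading
import time

SAMPLE_RATE = 1000   # samples/s -- an assumed rate, use your real acquisition rate
CHUNK = 50           # samples per simulated device read

def producer(q, stop_event):
    # Continuously "acquire" data and push each chunk into the queue.
    # In the real program this would be a continuous hardware-timed read;
    # it is simulated here so the sketch runs on its own.
    sample_index = 0
    while not stop_event.is_set():
        chunk = list(range(sample_index, sample_index + CHUNK))  # placeholder samples
        sample_index += CHUNK
        q.put(chunk)
        time.sleep(CHUNK / SAMPLE_RATE)  # pace the simulated acquisition

def consumer(q, stop_event, record_windows, recorded):
    # Pull every chunk, but keep only samples whose index falls inside a
    # "record" window; everything acquired during a pause is thrown away,
    # so the kept data is gapless and back to back.
    samples_seen = 0
    while not stop_event.is_set() or not q.empty():
        try:
            chunk = q.get(timeout=0.1)
        except queue.Empty:
            continue
        for offset, value in enumerate(chunk):
            i = samples_seen + offset
            if any(start <= i < stop for start, stop in record_windows):
                recorded.append(value)
        samples_seen += len(chunk)

if __name__ == "__main__":
    q = queue.Queue()
    stop = threading.Event()
    recorded = []
    # At 1 kHz: record 30 ms, skip 200 ms, record 10 ms, skip 50 ms, record 50 ms
    windows = [(0, 30), (230, 240), (290, 340)]
    prod = threading.Thread(target=producer, args=(q, stop))
    cons = threading.Thread(target=consumer, args=(q, stop, windows, recorded))
    prod.start()
    cons.start()
    time.sleep(0.6)   # run long enough to cover all the windows
    stop.set()
    prod.join()
    cons.join()
    print(len(recorded), "samples kept with no gaps")   # expect 90

Because the acquisition never stops, the timing of the kept data comes from the hardware sample clock, not from when your software decides to start and stop recording.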
Lynn
10-13-2009 01:15 PM
You could also use a two-frame sequence structure inside a loop, with the time (in ms) for each frame determined by numbers you enter into two different controls. You could also control the number of iterations with another control, which determines how long the entire data-gathering section runs.
Please be a little more clear on what you actually want to do.
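Roughly, the loop would behave like this Python stand-in (record_ms, pause_ms, and iterations play the role of the front-panel controls; the values are placeholders, and the read function is only a marker for where the DAQ read would go). Keep in mind this is software-timed, which is the precision limit the original post already ran into:

import time

def read_samples(duration_ms):
    # Stand-in for the "record" frame: in LabVIEW this would be the DAQ read.
    time.sleep(duration_ms / 1000.0)
    return []   # would return the samples read during this window

# Numbers the user would type into the two front-panel controls (placeholder values)
record_ms = 30
pause_ms = 200
iterations = 5   # external control: how many record/pause cycles to run

data = []
for _ in range(iterations):
    data.extend(read_samples(record_ms))   # frame 1: record for record_ms
    time.sleep(pause_ms / 1000.0)          # frame 2: pause for pause_ms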
10-13-2009 02:42 PM
Along John's line: could you give more information? Without the details it is hard for anyone to provide a good recommendation. If your data acquisition is triggered off a user event, I would be hard pressed to believe that you will ever get the kind of performance you are looking for. But if the acquisition is based on a timing diagram, I think there might be ways to efficiently collect the data at the intervals you are seeking.
Matt
10-14-2009 01:46 AM
I'll try to be more specific:
I have an algorithm in MATLAB that I am supposed to run in LabVIEW (with MathScript).
The output of the MATLAB code is an int16 array that is supposed to be transmitted through an analog output of my DAQ.
I also need to record this output.
The system that reads the output works at millisecond resolution.
Between one iteration of the algorithm and the next there is a 200 ms setup time that I don't want in my recording.
After I run this test I want to be able to play my recording back to the system without the setup time, which means the data needs to be sequential, without even a 1 ms delay between one iteration and the next.
Thank you for your help.
10-14-2009 09:40 AM