10-04-2019 01:53 PM
I'm trying to create a LabVIEW program so that my DAQ can output 6 utility bits and then acquire one analog signal (AI0) while those bits are set. I read the data from a CSV file: a matrix with 6 columns (one per bit), where each row is a separate input. I want to record the analog reading for each input row.
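For reference, here is a minimal Python sketch of parsing a CSV laid out as described (6 columns of 0/1 values, one row per test pattern). The filename and function name are illustrative, not part of the original setup.

```python
import csv

def load_bit_rows(path):
    """Read a CSV of 0/1 values into a list of rows of ints.

    Each row is one 6-bit digital pattern to write to the DAQ.
    """
    with open(path, newline="") as f:
        return [[int(cell) for cell in row] for row in csv.reader(f) if row]

# Example: a file containing the line "1,0,1,0,1,0"
# yields [[1, 0, 1, 0, 1, 0]]
```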
I'm new to LabVIEW and am having difficulty with the timing. I can send in my data fine; I just don't quite understand how to set up my program so that, once the input bits are set, only then is the output written to file for each iteration.
I have attached what I've mocked up so far. I apologize that it's quite messy; I was trying different methods (DAQ Assistant and DAQmx).
How do I set up the timing from when I write the bits through the DAQ to when I read the output?
10-04-2019 02:37 PM - edited 10-04-2019 02:38 PM
1. You really should learn not to use the DAQ Assistant. You will get along a lot better that way.
2. Your Digital Output task and Analog Input tasks need to stay separate.
3. Initialize your tasks before the loop and close them afterwards. You can do the writes and reads inside the loop.
4. You really should also look into not using Write To Spreadsheet File. It opens and closes the file on every iteration of your loop, which is extremely expensive. Instead, open the file before the loop, close it after the loop, and use Write Text File (along with Format Into String or Array To Spreadsheet String) inside the loop to write to the file.
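The loop structure described above (initialize before the loop, write/read inside it, open the file once) can be sketched in Python. Note that `write_digital_bits` and `read_analog_sample` are hypothetical stand-ins for the DAQmx digital write and analog read, not real API calls; only the open-once/close-once file pattern is the point here.

```python
def run_sequence(bit_rows, out_path, write_digital_bits, read_analog_sample):
    """Write each digital pattern, read AI0 once, log one line per pattern.

    The output file is opened once before the loop and closed once after,
    instead of being reopened on every iteration.
    """
    with open(out_path, "w") as out:
        for bits in bit_rows:
            write_digital_bits(bits)      # set the 6 utility bits first
            value = read_analog_sample()  # then read AI0 while they are set
            # one tab-separated line: the bit pattern, then the measured value
            out.write("\t".join(map(str, bits)) + f"\t{value}\n")
```

In LabVIEW terms, the `with open(...)` corresponds to Open/Create/Replace File before the loop and Close File after it, with the two writes and the read wired in sequence inside the loop so the read cannot happen before the bits are set.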
Here is a very quick cleanup of your code. It's far from what I would put out, but enough to get you started.
10-07-2019 12:30 PM
Thanks for your reply. This was very helpful.