07-22-2016 02:44 PM
Hi,
I have put together a fairly simple VI that plots data from incoming analogue channels and, when prompted to record, builds an array. When recording is finished, the data is written to a file. I am using a local variable between while loops to append array data. Everything is working smoothly. However, when I increase my sample rate beyond a couple thousand S/s (>5000 S/s), my final array is not as large as expected. Any suggestions for optimizing this? I do not understand where the bottleneck is. In the end I intend to be sampling 8 channels at 8,000 S/s or more. The VI is attached in case anybody wants to give something a shot. LabVIEW is pretty new to me.
Thanks,
Trevor
07-22-2016 02:56 PM - edited 07-22-2016 03:03 PM
Look for "producer/consumer" and how a queue can be used to transfer data between loops.
What is happening is that the consuming loop is not keeping up with the incoming data, so the local variable gets overwritten before it is read.
Put an indicator on the index terminals of the two loops to see that one loop is getting ahead of the other.
Ben
And if you do not believe me, just wait a couple of minutes and crossrulz will tell you the same thing.
07-22-2016 03:00 PM
The local variable is definitely NOT the proper way to pass this data between loops. You should be using a queue, since a queue is a lossless communication bus. You would also not need the wait in your second loop. You might want to have a good look at the Producer/Consumer Architecture.
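Since LabVIEW block diagrams can't be pasted as text, here is a rough sketch of the same producer/consumer idea in Python; the chunk size, rates, and names are illustrative, not taken from the attached VI:

```python
import queue
import threading
import time

data_q = queue.Queue()      # lossless FIFO shared by both loops
SENTINEL = None             # signals the consumer that acquisition is done

def producer():
    # Acquisition loop: every chunk is enqueued, so nothing gets overwritten.
    for i in range(100):            # stand-in for the DAQ read loop
        chunk = [i] * 8             # stand-in for one read of 8 channels
        data_q.put(chunk)           # enqueueing is effectively instant
        time.sleep(0.001)           # stand-in for the hardware-timed read
    data_q.put(SENTINEL)

def consumer():
    # Logging loop: get() blocks while the queue is empty, so no wait
    # or status check is needed, and no sample is ever lost.
    collected = []
    while True:
        chunk = data_q.get()
        if chunk is SENTINEL:
            break
        collected.extend(chunk)     # stand-in for appending/writing data
    print(f"received {len(collected)} samples")

t_prod = threading.Thread(target=producer)
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
```

The key point is that enqueueing never loses data: if the consumer falls behind, elements pile up in the queue instead of being overwritten the way a local variable is.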
07-22-2016 03:29 PM
Ben and Cross have hit the nail on the head; you will need a simple producer/consumer architecture to ensure no data is lost. Also, a personal observation: you might benefit a lot from continuously storing data to a non-volatile medium, mainly because:
-Dynamically allocated arrays tend to slow the system down as they grow, for a variety of reasons. Also, since they are stored in RAM, you might eventually run out of it, at which point you will get a "Not enough memory to complete this operation" error, which will force-close your application
-In the event of a blackout or program crash, data in RAM will be lost forever, which might be very detrimental if you're recording important info
What I'm doing in a datalogging application is storing data to a .tdms file periodically via the vanilla open > write > close pattern, to ensure minimal data loss in a crash event. A rough sketch of that idea is below.
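In Python terms (with a plain text file standing in for TDMS, and a made-up filename), the periodic open > write > close idea looks something like this:

```python
import queue

def log_chunks(data_q, path="run_data.csv", sentinel=None):
    # Open, append, and close on every chunk, so that at most one chunk
    # is lost if the program or the machine dies mid-run.
    while True:
        chunk = data_q.get()
        if chunk is sentinel:
            break
        with open(path, "a") as f:      # open > write > close each pass
            f.write(",".join(map(str, chunk)) + "\n")

# Usage: q = queue.Queue(); q.put([1.0] * 8); q.put(None); log_chunks(q)
```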
07-25-2016 07:57 AM
Thanks for the reply. I have incorporated the queue and it appears to be working well. However, once I stop recording (stop enqueuing data), my queue is massive and dequeuing each element takes forever. At this point, is the best technique to flush the queue and write the flushed data to a file?
07-25-2016 08:33 AM
The whole point of producer/consumer is having both loops working in parallel, so data is handled as soon as (or shortly after) it is collected, which means you are expected to enqueue and dequeue data at the same time. That way, when the producer stops sending new elements to the queue, the consumer will finish dequeuing whatever was left (if any) and patiently wait for new elements. The sketch below illustrates that drain-and-wait behaviour.
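In Python terms (the timeout value and names here are made up), that looks roughly like:

```python
import queue

def consume(data_q, timeout_s=0.5):
    # Runs in parallel with the producer. While the producer is enqueuing,
    # this handles elements as they arrive; once the producer stops, it
    # finishes dequeuing whatever is left in the backlog.
    total = 0
    while True:
        try:
            chunk = data_q.get(timeout=timeout_s)
        except queue.Empty:
            return total        # nothing new for timeout_s: treat as done
        total += len(chunk)     # stand-in for writing the chunk to file

# Quick demonstration: pre-load a large backlog, then start the consumer.
q = queue.Queue()
for i in range(10_000):
    q.put([i] * 8)
print(consume(q), "samples drained")
```

The timeout here is just one simple way to decide the run is over; a plain blocking dequeue with no timeout would instead wait indefinitely for new elements, exactly as described above.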
07-25-2016 08:45 AM
OK, so then I'm assuming that checking the status of the queue and the case structures inside my consumer loop must be slowing down the dequeue process.
07-25-2016 08:51 AM
@tgrieco wrote: OK, so then I'm assuming that checking the status of the queue and the case structures inside my consumer loop must be slowing down the dequeue process.
Probably, yes. Could you attach the code? Also, do you need to check the queue status for something in particular? The consumer loop will execute when it finds elements and wait when it doesn't, so no status check is needed; the contrast below shows why.
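As a rough Python analogy (a per-iteration Get Queue Status check corresponds to the polling loop here; names are illustrative), polling the queue adds latency that a blocking dequeue avoids:

```python
import queue
import time

data_q = queue.Queue()

def consumer_polling():
    # Anti-pattern: check the queue's status every pass, then dequeue.
    while True:
        if data_q.empty():
            time.sleep(0.01)        # the poll interval caps throughput
            continue
        chunk = data_q.get_nowait()
        # ... handle chunk ...

def consumer_blocking():
    # Preferred: the dequeue itself waits efficiently until data arrives,
    # so no status check or explicit wait belongs in the loop.
    while True:
        chunk = data_q.get()
        # ... handle chunk ...
```

In LabVIEW terms, letting Dequeue Element block on an empty queue plays the role of the blocking get() here.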
07-25-2016 09:02 AM
OK, that makes sense. Here is my code as of right now; I'm sure it's full of some rookie mistakes. First I'll get the dequeue out from behind any unnecessary processing. Then I think I'm going to work on writing directly to a file instead of building an array. Thank you for your prompt responses.