LabVIEW


Memory usage in a while loop with an xy graph

Solution accepted by topic author stephencox

Opening and closing the file every loop won't build up memory, but as the file gets larger, that specific code will take longer. It's really just not a best practice. Up to you.

 

If you double-click the Write Delimited Spreadsheet VI, you can see that it is made up of LabVIEW primitives. If you use those primitives to create your own version of the file I/O, you will end up with a file refnum. You can use the Advanced File I/O palette to get the file size and then just close the file and open a new one. There's an awesome new function in LV2015 called Create File With Incremented Suffix.
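Since LabVIEW code is graphical, here is only a rough Python sketch of that rollover logic (keep one file open across iterations, check its size, and start a new file with an incremented suffix once it passes a threshold); the 10 MB limit, the naming scheme, and the acquisition_running/read_sample helpers are all placeholder assumptions:

```python
import os

MAX_BYTES = 10 * 1024 * 1024                       # assumed rollover threshold

def open_log(suffix, base="log"):
    """Open a log file with an incremented suffix, e.g. log_0.csv, log_1.csv."""
    return open(f"{base}_{suffix}.csv", "a")

suffix = 0
f = open_log(suffix)
try:
    while acquisition_running():                   # placeholder loop condition
        row = read_sample()                        # placeholder data source
        f.write(",".join(map(str, row)) + "\n")
        f.flush()
        if os.path.getsize(f.name) > MAX_BYTES:    # roll over instead of reopening every iteration
            f.close()
            suffix += 1
            f = open_log(suffix)
finally:
    f.close()
```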

Cheers




Message 11 of 18

Thanks, that's a good suggestion. I will implement that and hope in the meantime that switching from the xy graph to an xy chart fixes my multi-day performance problem. If not, I can try to decouple logging and plotting, although I would rather understand the problem than just change things until it disappears mysteriously.

Message 12 of 18

If the XY Graph is the issue, then it's the memory associated with the array that you're writing to the graph. If you downsample that array (because you probably don't need 2 Hz data across multiple days on a graph) by, say, taking every 10th point, your array will be a tenth of the size and the graph won't look much different.
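As a rough text-form illustration (Python rather than G, and the factor of 10 is just an example), decimation before plotting keeps the display array small while the full-rate data still goes to disk:

```python
import numpy as np

samples = np.random.rand(2 * 3600 * 24)   # stand-in for one day of 2 Hz data
plot_data = samples[::10]                 # every 10th point goes to the graph; the full array still goes to the log
```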

 

If you really need all that data available when the user zooms in, you can do some fancy work with the old data files and load the historical data on zoom. In the end it's a trade-off between how much time you want to spend and how smart the code needs to be to give the user a great experience.

 

If you would, please look back through the thread, pick the answer that best matches your question, and mark it as the solution. This helps others find what they need in the future.

Cheers




Message 13 of 18

Well, my array is never more than 1000 points.  So if it is the xy graph, it is caching data in a way that I don't understand.

Message 14 of 18
Nope, you should be fine… You can go back to using a graph if you want, since you're maxing out at 1000 points. The problem was most likely just the file I/O.

Cheers




Message 15 of 18

@James.M wrote:

Opening and closing the file every loop won't build up memory, but as the file gets larger, that specific code will take longer. It's really just not a best practice. Up to you.

 

If you double-click the Write Delimited Spreadsheet VI, you can see that it is made up of LabVIEW primitives. If you use those primitives to create your own version of the file I/O, you will end up with a file refnum. You can use the Advanced File I/O palette to get the file size and then just close the file and open a new one. There's an awesome new function in LV2015 called Create File With Incremented Suffix.


And consider changing the file type to TDMS. TDMS files are much easier to work with for these types of systems, since they allow you to display the existing data without stopping writes to the file at the same time. Hence you only have to open a reference to the file once rather than for every write operation. There are good shipping examples and white papers on the advantages.
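The TDMS details are best taken from the shipping examples, but the open-once / append-many pattern can be sketched outside of LabVIEW too. Here is a rough Python illustration using the third-party npTDMS package (assuming its TdmsWriter/ChannelObject API); the file, group, and channel names are made up:

```python
import numpy as np
from nptdms import TdmsWriter, ChannelObject  # assumes the npTDMS package is installed

# Open the file once, then append one segment per loop iteration,
# instead of reopening the file for every write.
with TdmsWriter("cryo_log.tdms") as writer:        # hypothetical file name
    for _ in range(100):                           # stands in for the acquisition loop
        sample = np.random.rand(1)                 # stands in for one reading
        writer.write_segment([ChannelObject("cryo", "temperature", sample)])
```

On the block diagram the equivalent is TDMS Open before the loop, TDMS Write inside it, and TDMS Close after it.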


"Should be" isn't "Is" -Jay
Message 16 of 18

You are using a chart, not an xy graph, so simply set the chart history length to 1000 and get rid of all the array operations. (If you were using an xy graph, you should use a 1D complex array instead of cluster arrays to keep things simpler. You would also initialize a fixed-size array, replace the oldest point with each iteration, and use a plot style without interpolation so the order of the data in the array does not matter.)
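For the xy-graph variant, the fixed-size buffer idea (initialize once, replace the oldest point each iteration) can be sketched in text form roughly like this, with Python/NumPy standing in for the graphical code and the 1000-point size matching the buffer mentioned above:

```python
import numpy as np

N = 1000
xy = np.full(N, np.nan, dtype=complex)   # preallocated once: x in the real part, y in the imaginary part

def add_point(i, x, y):
    """Replace the oldest element in place; the buffer never grows, so memory stays flat."""
    xy[i % N] = complex(x, y)            # with a non-interpolating plot style, element order doesn't matter
```

The same replace-in-place idea is what Replace Array Subset gives you on the diagram.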

 

You have several race conditions due to the overuse of local variables. For example, the "Cryo setpoint" local will get read before the local inside the event gets written. Why don't you use the actual terminal? It just sits there outside, completely bored. You are also creating confusion between controls and indicators.

You have way too many event conditions for the send button. Whenever you press and release it, you are triggering four events: the button's mechanical action is "switch until released", so you get a value-changed event for the on transition and one for the off transition, and you also get events on mouse down and mouse up. That's way too much. Make the button "latch when released" and keep only the value-change event condition; now the event will fire exactly once per press. However, since you are spinning the loop anyway, you could get rid of the event structure entirely and simply poll the boolean, as sketched below.
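LabVIEW code is graphical, so this is only a rough Python sketch of the polling alternative; every function name below (acquisition_running, send_button_pressed, write_setpoint, read_setpoint_control, update_chart) is a hypothetical placeholder, and the 0.5 s sleep just assumes the ~2 Hz loop rate mentioned earlier:

```python
import time

while acquisition_running():                      # placeholder: the loop already spins at a fixed rate
    if send_button_pressed():                     # placeholder: latched boolean reads True once per press
        write_setpoint(read_setpoint_control())   # placeholder helpers for the "send" action
    update_chart()                                # placeholder: append the latest reading to the chart
    time.sleep(0.5)                               # assumed ~2 Hz loop timing
```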

 

Here's what the core code could look like. Keep it simple!

[block diagram snippet omitted]
Message 17 of 18

Ok, I broke the big VI up into several smaller pieces, which is cleaner anyway. They run either at different rates or on demand to serve different needs. It quickly became clear that the cause of the problems was not what I thought; it was the calls made by reference to a low-level VI on an older machine running LabVIEW 8.6 that does nothing but talk to some old hardware.

 

I noticed that sometimes, after the VI had run for many days and developed the problem of running slowly and taking forever to stop, it would also cause LabVIEW 8.6 on the other machine to crash when I did stop it. I dug into the VI reference calls and found one that was missing a "Close Reference." I won't be sure for a few more days, but I bet this is what was causing the problems.
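For what it's worth, a leaked reference produces exactly that kind of slow creep. In text form the fix is the usual open/use/close-in-a-finally pattern; the Python below is only an analogy, and open_vi_reference, call_by_reference, and close_reference are hypothetical stand-ins for the Open VI Reference, Call By Reference, and Close Reference nodes:

```python
def query_old_hardware():
    ref = open_vi_reference("ReadOldInstrument.vi")   # hypothetical stand-in for Open VI Reference
    try:
        return call_by_reference(ref)                 # hypothetical stand-in for Call By Reference
    finally:
        close_reference(ref)                          # always released, even on error,
                                                      # so references cannot pile up over days
```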

Message 18 of 18