03-30-2009 03:57 PM
I am trying to read in a binary file that is about 300 MB. At the read-file VI, I get an error message popping up saying that I do not have enough memory to complete the operation. The system has 2 GB of memory. Files that are 150 MB read in perfectly fine, with no system lag. Monitoring physical memory usage, hardly any is being used. Does LabVIEW not recognize the correct amount of memory in my system, or is there some other problem preventing me from reading in a large file?
Thanks,
JR
03-30-2009 04:21 PM
Hi JR,
LabVIEW needs contiguous chunks of memory for arrays. This may be the cause of such errors...
You may:
- work on parts of the file
or:
- initialize an array the size of the file, then load smaller parts of the file in a loop and replace array subsets (roughly like the C sketch below)
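In C terms, the second approach looks roughly like this (a sketch with hypothetical sizes and file name - the LabVIEW equivalent is Read from Binary File inside a loop, with its count input wired and Replace Array Subset filling a pre-allocated array):

#include <stdio.h>
#include <stdlib.h>

#define ROWS       10
#define COLS       8000000UL   /* full array: 10 x 8,000,000 singles */
#define CHUNK_ELMS 1000000UL   /* elements read per loop iteration  */

int main(void)
{
    /* Pre-allocate the whole destination once, so nothing has to be
       grown or copied while reading. */
    float *data = malloc((size_t)ROWS * COLS * sizeof *data);
    if (!data) { fprintf(stderr, "allocation failed\n"); return 1; }

    FILE *fp = fopen("bigfile.bin", "rb");   /* hypothetical name */
    if (!fp) { free(data); return 1; }
    /* If the file was written with array size headers prepended,
       skip those bytes here first. */

    size_t total = (size_t)ROWS * COLS;
    size_t done  = 0;
    while (done < total) {
        size_t want = total - done;
        if (want > CHUNK_ELMS) want = CHUNK_ELMS;
        size_t got = fread(data + done, sizeof *data, want, fp);
        if (got == 0) break;                 /* EOF or read error */
        done += got;
    }

    fclose(fp);
    /* ... use data, then ... */
    free(data);
    return 0;
}

The point is that only one big allocation happens, up front, and each read only touches a small window of it.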
03-30-2009 04:25 PM
I'm still sort of surprised that the memory error happens between 150 and 300 MB. LabVIEW is supposed to be able to read up to 2 GB through File I/O with no problems.
03-30-2009 04:50 PM
Hi JRaetz,
What kind of binary file and how are you reading it?
Watch out for data copies, using the Tools > Profile > Show Buffer Allocations tool. 300 MB of data becomes very large if it is copied a couple of times.
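The C analogue (a hypothetical function, just to illustrate the effect): every step that returns a fresh buffer instead of working in place roughly doubles what is live at once, and branching a big array wire in LabVIEW can do the same.

#include <stdlib.h>

/* With n singles in and n singles out, ~2x the data is alive at the
   same time. Chain a few steps like this and 300 MB becomes 600 MB,
   900 MB, ... */
float *scaled_copy(const float *in, size_t n, float k)
{
    float *out = malloc(n * sizeof *out);
    if (!out) return NULL;
    for (size_t i = 0; i < n; i++)
        out[i] = in[i] * k;
    return out;
}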
This link might help also: http://zone.ni.com/devzone/cda/tut/p/id/3625
Please post a block diagram image or, better still, the VIs that you think are causing the problem.
Regards
Ruben
03-31-2009 08:35 AM
It is a 2D array of singles, 10 by roughly 8,000,000 (exactly 201 × 200 × 200 = 8,040,000 columns), written to a binary file using Write to Binary File and read back using the matching Read from Binary File from the File I/O palette. The file size comes out to 10 × 201 × 200 × 200 × 4 bytes/SGL = 321,600,000 bytes ≈ 314,062 KB.
If it were a problem with not having that much contiguous memory, wouldn't it attempt to reorganize memory or use virtual memory?
03-31-2009 08:54 AM
Hi JR,
I said "maybe" - but that's the most common source of problems when handling (really) big arrays. Virtual memory doesn't help here: LabVIEW runs as a 32-bit process, so the limit is contiguous free space inside its virtual address space (about 2 GB), not physical RAM - and that address space gets fragmented over time.
Still, my advice would be to load smaller parts of the file and use them to replace data in a pre-allocated array - if you really need that much data in memory at once.
03-31-2009 09:48 AM
Gerd-
Can you tell me how I can read part of a binary file instead of the entire thing? It's definitely an option worth trying, but I can't figure out how to do it.
Thanks,
JR
03-31-2009 10:35 AM - edited 03-31-2009 10:37 AM
Hi JR,
Did you notice the context help window? At least for LV 7.1 it provides the necessary information.
Do you notice the inputs for number of rows/columns? And the read offset, which you have to wire when reading chunks from a file?
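If C is more familiar, wiring the read offset corresponds roughly to seeking before reading (a hypothetical helper, since a diagram doesn't paste well as text):

#include <stdio.h>

/* Read 'count' singles starting at element index 'elem_offset' into a
   caller-supplied buffer; looping over offsets reads the file chunk by
   chunk. Returns the number of elements actually read. */
size_t read_slice(FILE *fp, long elem_offset, size_t count, float *out)
{
    if (fseek(fp, elem_offset * (long)sizeof(float), SEEK_SET) != 0)
        return 0;
    return fread(out, sizeof(float), count, fp);
}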
03-31-2009 01:14 PM
03-31-2009 02:21 PM
Thanks to everyone responding and trying to help.
I am using LabVIEW 8.6, and I cannot find a Read from SGL File block. Through subsequent testing I have found that I can read the file in three separate parts, but only once: after running the VI to read it into three sub-arrays, trying to run it again produces the memory error again. Shutting down and restarting LabVIEW fixes it. Monitoring my memory usage, when I can successfully read into the three arrays, I jump from ~400 MB of used memory to just under 1 GB. The second time, when reading the file does not work, I'm up to 1.2 GB used of my 2 GB available. I don't know if any of that is helpful, but I thought I'd add it.
What do you mean by taking a dive into the code for the block? I'm a C programmer who started using LabVIEW three months ago, so seeing what goes on behind the scenes would help, but I don't know how to do that.
Thanks again,
JR