
Allocating enough memory to open a large data file

Could you attach a simplified version of your VI? Many things are not quite clear.

 

What exactly are you doing when the file ends up with all zeroes? It is possible that you are running into this known bug.

 

You need to arrange the binary data correctly so that it meshes seamlessly into one big dataset inside the file. Remember, a file is just a linear series of bytes.

 

Instead of a 2xN array of DBL pairs, you could merge each pair into a complex DBL (x = RE, y = IM) and then deal with a 1D array of CDB instead.
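
LabVIEW diagrams don't paste as text, so here is that idea sketched in Python/NumPy instead (the array contents are just placeholders):

    import numpy as np

    x = np.array([1.0, 2.0, 3.0])   # first row: RE
    y = np.array([4.0, 5.0, 6.0])   # second row: IM

    # Merge the two DBL channels into one 1D complex array,
    # the same idea as LabVIEW's "Re/Im To Complex":
    z = x + 1j * y                  # dtype complex128, i.e. CDB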

 

Why are you still using 64 bits (DBL) if your data is only 18 bits?

 

How exactly are you retrieving the data?

Message 11 of 33

Hello. You can use "Initialize Array.vi" to pre-allocate memory for a data acquisition and, instead of using "Build Array.vi" to add data to an array, use "Replace Array Subset.vi" to insert the data into the array.
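
A rough sketch of the same pattern in Python/NumPy, since the diagram can't be shown here (acquire() is a hypothetical stand-in for your acquisition call):

    import numpy as np

    def acquire(n):                          # hypothetical stand-in for the DAQ read
        return np.random.rand(n)

    n_total, chunk = 50_000_000, 1000
    buf = np.zeros(n_total)                  # "Initialize Array": one allocation, up front

    for i in range(n_total // chunk):
        data = acquire(chunk)
        buf[i*chunk:(i+1)*chunk] = data      # "Replace Array Subset": in place, no reallocation
        # a Build Array here would instead reallocate and copy the whole buffer every iteration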

 

As for reading large files, as mentioned above, you can use the basic Open File, Read Bytes from File, and Close File to read a large data file in portions INSTEAD of "Read Spreadsheet File.vi". If you open "Read Spreadsheet File.vi" you can see how the sub-VIs are called.
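
In text form that read loop looks something like this (Python sketch; the file name and chunk size are placeholders):

    CHUNK = 1 << 20                          # 1 MiB per read, not the whole file
    with open("bigdata.dat", "rb") as f:     # "Open File"
        while True:
            block = f.read(CHUNK)            # "Read Bytes from File"
            if not block:                    # empty read = end of file
                break
            # ... process `block` here, chunk by chunk ...
    # leaving the with-block closes the file ("Close File")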

 

Good Luck

 

 

Message 12 of 33

For some reason I had thought that it was doubles that were 32 bits with 23 bits of precision, but it looks like it's single-precision floats that are. I will change those doubles to floats, then.

 

I'm not getting a file filled with zeros; I'm getting a file with nothing in it except the prepended data. What I'm trying to do is initialize a 2x50M array full of 0s and replace the zeros with my data as it comes in. I'm doing this because simply appending the new data to an existing array is too slow. What ends up happening, though, is that it tells me the memory is full and uses an empty array instead, so when the VI finishes it writes an empty array (i.e., nothing) to the file. Not zeros, just nothing.

 

It seems to work fine for 2x15M but not 2x20M or higher. 2x50M of DBLs is only 800MB, though, so I still don't see why I should be having a problem, even with that many, let alone 2x20M, which is only 320MB.

 

I've attached the VI. The subVI in there is just to remove the excess zeros from the end of the array, since I'm initializing it to a size slightly bigger than I'll need. This isn't why nothing is getting written to my file, though: I've checked and the array is already empty before it goes through the stripper.

Message 13 of 33

Because you are using a constant array, you are unwittingly doubling the memory usage (see here).

If you want to do something like this, use the following construct:
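
(The original image doesn't survive as text; the point, sketched in Python/NumPy, is to build the buffer at run time rather than dropping a huge array constant on the diagram:)

    import numpy as np

    N = 50_000_000
    # A diagram constant of N elements stores the data twice: once inside the
    # constant itself and once in the running copy. "Initialize Array"
    # (np.zeros here) creates the buffer only at run time, and only once:
    buf = np.zeros(N)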

 

The same construct should be used when reading files.

 

Ton

Message 14 of 33

You should write the transposed array instead; that way the data from successive write operations mesh nicely into one contiguous dataset.

 

Here's a very (very!) rough draft (8.6).

 

Of course there is nothing wrong with appending directly to the file, as in the above example.
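
For reference, a loose Python sketch of that append-as-you-go scheme (acquire_pair() is a hypothetical stand-in returning one 2 x N chunk; the byte order just has to match when you read it back):

    import numpy as np

    def acquire_pair(n=1000):                    # hypothetical DAQ read: 2 x n DBLs
        return np.random.rand(2, n)

    with open("scan.dat", "ab") as f:            # append mode: each chunk lands after the last
        for _ in range(100):                     # placeholder chunk count
            pair = acquire_pair()
            pair.T.astype("<f8").tofile(f)       # transpose to n x 2 so the samples interleave
                                                 # as x0, y0, x1, y1, ... in one flat byte stream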

 

Message 15 of 33

I see. I had thought that doing something like that would be too slow, but I tried out your suggestion and it seems to be fast enough.

I'm having a couple of problems now, though, when reading it. I haven't used binary files in LabVIEW before, so the below is what I think I need to do to get the data out. The problem is that, again, because I'm building an array, it goes unbearably slowly. Also, for some reason there are a ton of completely empty arrays (several empty arrays for each one that actually has data in it). Is there a more efficient way than this to get my data out?

Message 16 of 33

Read it as a plain DBL array and reshape it to two columns (or simply use "Decimate 1D Array" for simple cases).

 

 

Also, when you write it, you need to wire a FALSE to "prepend array size". This way the data in the file remains flat and you can read it with other boundaries later.
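
In Python/NumPy terms, the read side then looks like this (assuming the file was written flat, without the prepended size, and with a matching byte order):

    import numpy as np

    flat = np.fromfile("scan.dat", dtype="<f8")  # plain DBLs, no size headers
    pairs = flat.reshape(-1, 2)                  # N x 2: column 0 = x, column 1 = y

    # or the equivalent of "Decimate 1D Array" with two outputs:
    x, y = flat[0::2], flat[1::2]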

Message 17 of 33
Thank you, that did work. Now I'm getting memory errors when I try to read it back. I just did a scan with the new digitization VI; the file ended up being just under 600MB, and even using Read Binary, it told me there was not enough memory to open it. The VI I'm using to open and analyze the file works when I collect a smaller set of data, but not with a full set. Even when I use single-precision floats, halving the data file to just under 300MB, it still won't open. With 3GB of memory, this shouldn't be a problem. Why is it?
Message 18 of 33

BrockB wrote:
under 300MB, it still won't open. With 3GB of memory, this shouldn't be a problem. Why is it?

In every programming language, an array is placed in one contiguous (unfragmented) piece of memory. Your system just doesn't have a single contiguous 300 MB block of address space available: a 32-bit LabVIEW process gets at most 2-3 GB of virtual address space no matter how much RAM is installed, and fragmentation shrinks the largest free block well below that. Relying on reading the whole file at once is very dangerous anyway; what if you want to open the file on a system with 1 GB?

 

Sorry about the transpose; it has been a long time since I have used plain data storage.

My code for writing the file can be used for reading the file as well.

 

Ton

Message 19 of 33
I don't need to open this file on another system. I do need to perform an FFT on the data, though, and that is one thing that needs *all* of the data; it cannot be done in pieces. Again, there must be a way to open a 300MB file on a system with 3GB of RAM. How can I do it?
Message 20 of 33