You might have 1GB of memory available in your system, but the problem is that it is scattered across many different places: 5MB here, 250kB there, and so on. Just like C and many other programming languages, when LabVIEW allocates memory for an array, that memory must all be contiguous. So when you try to read a 1GB file directly into an array, you don't just need 1GB of memory, you need 1GB of contiguous memory! That's a huge chunk, and one that isn't easy to come by on a 32-bit operating system.
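If it helps to see the difference in plain C terms (this is just an illustration of the allocation behavior, not LabVIEW code, and whether the big request actually fails depends on how fragmented your particular system is):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* One contiguous 1GB request -- this can fail in a fragmented
       32-bit address space even when 1GB is free in total. */
    void *big = malloc(1024UL * 1024UL * 1024UL);
    printf("1GB in one piece:  %s\n", big ? "succeeded" : "failed");
    free(big);

    /* The same total amount as 1024 separate 1MB requests -- each
       one only needs 1MB of contiguous space. */
    void *pieces[1024] = { NULL };
    int all_ok = 1;
    for (int i = 0; i < 1024; i++) {
        pieces[i] = malloc(1024UL * 1024UL);
        if (pieces[i] == NULL) { all_ok = 0; break; }
    }
    printf("1GB in 1MB pieces: %s\n", all_ok ? "succeeded" : "failed");

    for (int i = 0; i < 1024; i++)
        free(pieces[i]);   /* free(NULL) is safe */
    return 0;
}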
The best way to accomplish reading your file is to take the advice the other poster gave and read the file in smaller chunks. You could, for instance, read 1000 1MB chunks or 10000 100kB chunks. You can then store the data in a giant queue, or perhaps in a 1D array of clusters of 1D arrays, so that no single allocation has to be anywhere near 1GB.
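Since I can't paste a block diagram here, here's a rough sketch in C of what chunked reading looks like; the file name and the 1MB chunk size are just placeholders. In LabVIEW the equivalent is a loop around Read from Binary File with the count input set to your chunk size.

#include <stdio.h>
#include <stdlib.h>

#define CHUNK_SIZE (1024 * 1024)   /* 1MB per read -- pick what suits you */

int main(void)
{
    /* "data.bin" is just a placeholder name for your 1GB file. */
    FILE *fp = fopen("data.bin", "rb");
    if (fp == NULL) {
        perror("fopen");
        return 1;
    }

    char *chunk = malloc(CHUNK_SIZE);   /* only 1MB contiguous at any time */
    if (chunk == NULL) {
        fclose(fp);
        return 1;
    }

    size_t bytes_read;
    while ((bytes_read = fread(chunk, 1, CHUNK_SIZE, fp)) > 0) {
        /* Process this chunk here (or enqueue it) before reading the next,
           so the whole file never has to sit in one contiguous array. */
    }

    free(chunk);
    fclose(fp);
    return 0;
}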
Do you really need the entire 1GB file of data in memory at the same time? Can you not just extract a portion at a time?
Jarrod S.
National Instruments