10-19-2012 05:29 AM
Hey Greg,
Much clearer now. I really appreciate the response!
Thanks!
TheLT
10-20-2012 06:57 PM
@johnsold wrote:
Greg,
There definitely is a point where it becomes necessary to read in chunks. I was recently processing text-based spreadsheet files of >100 MB, and having both the string and numeric arrays in memory at the same time created memory problems. By reading in chunks I was able to get the memory footprint down to a few percent above the final numeric array size.
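The idea Lynn describes — converting each chunk of text to numbers and discarding the raw strings before reading the next chunk, so the string and numeric data never coexist for the whole file — could be sketched roughly like this in Python (the thread doesn't name a language, so the file layout, tab delimiter, and chunk size here are all assumptions for illustration):

```python
import os
import tempfile

def read_numeric_chunks(path, chunk_lines=10_000):
    """Parse a tab-delimited numeric text file into rows of floats,
    holding only chunk_lines of raw text in memory at any one time."""
    rows = []
    with open(path, "r") as f:
        chunk = []
        for line in f:
            chunk.append(line)
            if len(chunk) >= chunk_lines:
                # Convert this chunk to numbers, then drop the strings
                # before reading more text from the file.
                rows.extend([float(v) for v in ln.split("\t")] for ln in chunk)
                chunk = []
        # Convert whatever is left after the last full chunk.
        rows.extend([float(v) for v in ln.split("\t")] for ln in chunk)
    return rows

# Demo with a small temporary file (chunk_lines=2 just to exercise the loop).
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("1\t2\n3\t4\n5\t6\n")
    name = tmp.name
data = read_numeric_chunks(name, chunk_lines=2)
os.remove(name)
print(data)  # [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
```

The peak memory cost is the final numeric array plus one chunk's worth of strings, which is why the footprint ends up only a few percent above the array size.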
Lynn
Lynn,
We should chat! I recently did the same thing: hundreds of rows by hundreds of columns in CSV files. Luckily the computer was able to handle it all in memory at once, but I was really pushing the boundaries. If they expand the data to include additional columns, we may need to refactor. Unfortunately I was under a time crunch, and since there was enough memory and the files generally shouldn't change much, I took the quick way out.
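For contrast with the chunked approach, the "all in memory at once" route would look something like the following Python sketch (again, the thread names no language, and the header and values here are made up for illustration):

```python
import csv
import io

# Hypothetical CSV content standing in for one of the data files.
text = "0.0,1.5,2.5\n0.1,1.6,2.4\n"

# Read the entire file into a numeric table in one pass: simple and fast
# to write, but both the raw text and the parsed numbers exist at once,
# so memory use roughly doubles at peak.
reader = csv.reader(io.StringIO(text))
table = [[float(v) for v in row] for row in reader]
print(table)  # [[0.0, 1.5, 2.5], [0.1, 1.6, 2.4]]
```

That peak-memory doubling is exactly what becomes a problem if the data grows more columns, and why a refactor toward chunked reading might be needed later.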
10-21-2012 09:36 AM
Greg,
PM sent.
Lynn