08-16-2016 11:13 AM
It can live on disk, the same way we can watch a video that contains more data than our machine has memory.
I suspect that if we watch the disk I/O while scrolling through the data, we would see bumps in I/O come and go as we scroll. (Just check using Resource Monitor >> Disk.)
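Roughly what that looks like in a textual language (a minimal Python/NumPy sketch, not LabVIEW; the file name and sizes here are made up): a memory-mapped array lives on disk and is paged into RAM only as it is read, so the disk I/O spikes come and go with the access pattern, just like scrolling.

```python
import numpy as np

rows, cols = 100_000, 250   # ~200 MB of float64, larger than a "small" RAM budget

# Create a file-backed array once (hypothetical file name).
data = np.memmap("big_table.dat", dtype=np.float64, mode="w+", shape=(rows, cols))
data[:] = 0.0
data.flush()

# Re-open read-only and touch one "screenful" at a time; only those pages are
# read from disk, which shows up as bumps in Resource Monitor > Disk.
view = np.memmap("big_table.dat", dtype=np.float64, mode="r", shape=(rows, cols))
for start in range(0, rows, 10_000):
    chunk = view[start:start + 10_000]
    print(start, float(chunk.mean()))
```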
Ben
08-16-2016 11:24 AM
I would be very surprised if a VI only loads partially initially. I guess we have to wait for a blue guy to give the final word. 😄
08-16-2016 11:33 AM - edited 08-16-2016 11:34 AM
With the constant not being used, the VI translates into a virtual NOOP as far as the code goes. The THINGY that looks like an array constant can be just a visual placeholder to which we can attach nodes; once we do so, the data becomes part of the compiled code, as was illustrated above: connecting an indicator to the array moved the data into the code.
Also note that the errors reported when trying a larger array affected the VI's file size and pushed it to the limit.
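A textual-language analogy of that dead-code point (a Python sketch, not how the LabVIEW compiler actually works; Rob's post later in the thread describes the real mechanism). Recent CPython compilers similarly drop a constant expression that nothing consumes, while a constant that is "wired" to the output has to survive into the compiled code:

```python
import dis

def constant_unwired():
    # A bare constant expression that nothing uses: recent CPython versions
    # emit no bytecode for it, so it costs nothing at run time.
    123456789
    return None

def constant_wired():
    # The same constant "wired" to the output must exist in the compiled code.
    return 123456789

dis.dis(constant_unwired)                    # the unused literal generates no instruction
dis.dis(constant_wired)                      # the constant appears in the bytecode
print(constant_unwired.__code__.co_consts)   # typically just (None,)
print(constant_wired.__code__.co_consts)     # includes 123456789
```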
I betcha Rolf would know 😉
Ben
08-16-2016 12:22 PM - edited 08-16-2016 12:36 PM
File sizes don't have a 32-bit limit in general (maybe VI files do?). I wonder if the array size limits would be different on 64-bit LabVIEW...
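For a rough sense of scale (plain arithmetic, not measured LabVIEW behavior; as far as I know LabVIEW array sizes and indices are signed 32-bit on both bitnesses, so the practical difference on 64-bit LabVIEW is address space rather than the index type):

```python
I32_MAX = 2**31 - 1      # assumed I32 limit on elements per array dimension
BYTES_PER_DBL = 8

print(f"Max elements per dimension : {I32_MAX:,}")
print(f"Full-length 1-D DBL array  : {I32_MAX * BYTES_PER_DBL / 2**30:.1f} GiB")
print(f"250 x 250 DBL constant     : {250 * 250 * BYTES_PER_DBL:,} bytes")
# A 32-bit process only gets roughly 2-3 GiB of usable address space on Windows,
# so 32-bit LabVIEW tends to hit the memory ceiling long before the index limit.
```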
08-17-2016 05:55 PM
I didn't have time to read the myriad speculations as to this inconsistency, but here is why it is what it is:
The code that measures the size of the "Block Diagram Objects" is ignoring the "source code" copy of the constant array's value. This could rightly be called a bug, but I suspect the severity of it (which I personally rate as "low") makes it unlikely to be fixed. The mechanism that measures it is stupidly elegantly simple right now, and to make it correct would likely be more work than it is worth.
The "Data" measurement is accurate, and is small in the unmodified VI because the value of the constant is not used, and its existence is removed as "dead code" from the compiled VI's data space.
If you wire the constant to an output, you will see that the "Data" measurement goes up to a big number approximately equal to 500,000 bytes (250 rows * 250 cols * 8 bytes per double), or 489.2 K (where K=1024) bytes.
BTW, the "On Disk" measurement of this VI is smaller than the "Data" size because of compression.
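A quick back-of-the-envelope check of both numbers (a Python sketch; the actual compression scheme inside a VI file isn't specified here, zlib is just a stand-in, and a constant full of identical default values compresses far better than real data would):

```python
import array
import zlib

rows, cols = 250, 250
# Assumption: the constant still holds its uniform default values (all zeros).
flat = array.array("d", [0.0] * (rows * cols)).tobytes()

print(f"Flat 'Data' size : {len(flat):,} bytes")            # 250 * 250 * 8 = 500,000
print(f"zlib-compressed  : {len(zlib.compress(flat)):,} bytes")
```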
08-18-2016 08:58 AM
@robdye wrote:
The code that measures the size of the "Block Diagram Objects" is ignoring the "source code" copy of the constant array's value. This could rightly be called a bug, but I suspect the severity of it (which I personally rate as "low") makes it unlikely to be fixed. [...]
Thank you Rob!
Christian,
Can we call that a solution to your original question?
Ben
08-18-2016 09:03 AM
Ben wrote: Can we call that a solution to your original question?
Just did! 🙂
It is always appreciated if somebody featured on the LabVIEW 1 splash screen gives the final answer. 😄 Thank you so much!
08-18-2016 09:08 AM
Thank you!
Now how about looking at posts 4, 9, and 13...
Hmmmmm?
Ben