LabVIEW


How to determine binary file data set size

Solution
Accepted by dacad

No, that would not be simple.

 

Databases use an index file to find records of interest without having to read the entire file. If you keep another file that is written every time you write more data to the data file, it will contain the indexes that you will need.

 

If you wrote to the file 50 times, it will have 50 index values, where an "index value" is the file position.
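
In text-language terms the idea is roughly this (a quick Python sketch, since I can't paste G code inline; the file and function names are just placeholders):

    import os, struct

    def write_record(data_path, index_path, record_bytes):
        """Append one record and note where it started in a side index file."""
        with open(data_path, "ab") as data, open(index_path, "ab") as index:
            position = data.seek(0, os.SEEK_END)      # file marker where this record begins
            data.write(record_bytes)
            index.write(struct.pack(">q", position))  # one int64 offset per record

    def record_offset(index_path, i):
        """Find where record i starts later, without scanning the data file."""
        with open(index_path, "rb") as index:
            index.seek(i * 8)                         # each index entry is 8 bytes
            return struct.unpack(">q", index.read(8))[0]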

 

I do not think I can find anything quickly to back me up, but the file size is not the same thing as the file marker. Your file can be larger than the amount of data it contains; file size speaks to the amount of space that is allocated for the file on disk.

 

Rather than use the file size, set the marker to the end (offset 0 relative to the end) and then read the marker value relative to the beginning. Look at the help for "Set File Size".
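
A text-language equivalent of that marker trick (again just a sketch, with a placeholder file name):

    import os

    with open("data.bin", "rb") as f:
        end_of_data = f.seek(0, os.SEEK_END)   # marker moved to offset 0 relative to the end
        print("marker relative to the beginning:", end_of_data)
        f.seek(0)                              # back to the beginning before reading records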

 

and while I am at it...

 

I think if you open the file as a data log file you can use a datalog file read.

 

Also double check that empty array as Christian posted.

 

LV has some weird definitions of an empty array that could be part of the answer.

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper | LinkedIn Profile | YouTube Channel
Message 11 of 22

I added in all the proper open and close for both the write and the read. I know that is the correct way to do it, but I had tried that earlier and it had not seemed to make any difference. I went through it again, however. Even with a fresh file open and set position for both the write and read VIs, I still get an error if I feed the refnum into the Read Binary.

 

I set the byte order (endian) input to little-endian, though I really have no clue whether that has an effect. It doesn't seem to matter either way, no matter what else is going on.

 

Starting to struggle........

 

Doug

Doug

"My only wish is that I am capable of learning each and every day until my last breath."
Message 12 of 22

Didn't even think about the typedef(s).

 

My bad

Doug

"My only wish is that I am capable of learning each and every day until my last breath."
Message 13 of 22

Datalog file....... That may hold promise. I will be on that first thing in the morning; after 9 straight hours my eyes are fuzzy.

 

will report back with results

 

Thanks....    Doug

Doug

"My only wish is that I am capable of learning each and every day until my last breath."
Message 14 of 22

Whoo hoo, the datalogging file format works flawlessly. It seems like it would be a little higher in the menu tree. I usually take it that the deeper in the tree you go, the more complex or specialized the functions. I would expect to see this datalogging menu as a selection under the top-level File I/O menu, but that's just me.

 

End result, I'm off to the next task.

 

Thanks for all the feedback on this.

 

Doug

Doug

"My only wish is that I am capable of learning each and every day until my last breath."
Message 15 of 22

@dacad wrote:

Whoo hoo, the datalogging file format works flawlessly. It seems like it would be a little higher in the menu tree. I usually take it that the deeper in the tree you go, the more complex or specialized the functions. I would expect to see this datalogging menu as a selection under the top-level File I/O menu, but that's just me.

 

End result, I'm off to the next task.

 

Thanks for all the feedback on this.

 

Doug


Pssttt....

 

Just between you and me.

 

NI and LV started out on the Mac and, like the Mac, push all of that dirty techno stuff down and out of sight in the hope it will not scare people off.

 

But that point aside, I wrote something years ago to the effect of:

 

"Learning LV is like wondering in a beutiful wood, with mysteries and wonders hidning under ever rock."

 

Get into the habit of flipping over those rocks and poking the old logs. You will be surprised what comes crawling out.

 

Thanks for the update!

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper | LinkedIn Profile | YouTube Channel
Message 16 of 22

Actually, the datalog interface to the File I/O functions is in fact a relic from the very early days of LabVIEW. While it may seem to work for your specific case, it is quite troublesome, limited, and rather awkward to use. For one thing, it does not support random access to the records: if you want to read the last records in a file, LabVIEW always has to seek through the entire file, since the stored data format does not have fixed-size records. This can become VERY slow once your files accumulate a lot of records. If the LabVIEW developers had a say, it would have been gone several versions ago, but once introduced and documented, functionality is quite holy in LabVIEW and can't just be removed unless it causes more trouble than dropping it would.
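
To illustrate why that hurts, here is a rough sketch (Python, with the record layout heavily simplified; the real datalog format is more involved) of reading record n from variable-size, length-prefixed records versus from fixed-size records:

    import struct

    def read_record_variable(path, n):
        """Variable-size records: every earlier record must be walked to reach record n."""
        with open(path, "rb") as f:
            for _ in range(n):
                (length,) = struct.unpack(">i", f.read(4))  # length prefix of a record to skip
                f.seek(length, 1)                           # hop over its payload
            (length,) = struct.unpack(">i", f.read(4))
            return f.read(length)

    def read_record_fixed(path, n, record_size):
        """Fixed-size records: a single seek, no matter how many records come before."""
        with open(path, "rb") as f:
            f.seek(n * record_size)
            return f.read(record_size)

The first version does work proportional to n for every read; the second is constant.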

 

I have also looked a little more closely at your VI and cleaned it up a bit.

 

While the specific data set you have as default data in the cluster always produces exactly 6 numbers for each of the 4 arrays (and therefore 52 bytes per array: 6 * 8 + 4), are you sure that you always passed data sets to this VI that produced the same number of elements per array? The 8-byte difference you see is exactly one double-precision floating-point value, and it would not seem difficult to pass data to this VI that does not produce fixed-size array blocks.
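
To put numbers on it (a quick sanity check, assuming a prepended int32 size plus 8 bytes per DBL for each 1D array):

    def flattened_size(n_elements, bytes_per_element=8, header_bytes=4):
        """Bytes one 1D DBL array occupies: int32 length header plus the elements."""
        return header_bytes + n_elements * bytes_per_element

    print(flattened_size(6))   # 52 -> matches the default data
    print(flattened_size(5))   # 44 -> one element fewer explains an 8-byte smaller block
    print(flattened_size(0))   #  4 -> even an empty array still costs its length header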

 

With the cleaned-up VI, repeatedly calling it with the default data, I saw a steady and constant increase of 52 bytes for every file and each execution.

Rolf Kalbermatter
My Blog
Message 17 of 22

@Ben wrote:

Also double check that empty array as Christian posted.

 



Empty arrays written to disk as a binary stream are n * 4 bytes long, with n being the number of dimensions of the array. Each of those 4-byte groups is an int32 set to the number of elements in that dimension, which doesn't need to be 0 for multidimensional arrays. A 4 x 0 2D array is fully legal in LabVIEW, resulting in 0 data elements but still a size of 4 for the first dimension. So physically it is empty, but logically it isn't.
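
As a small illustration (Python, showing the big-endian int32 headers LabVIEW writes by default):

    import struct

    empty_1d     = struct.pack(">i", 0)      # 4 bytes: one dimension size, zero elements
    empty_2d_4x0 = struct.pack(">ii", 4, 0)  # 8 bytes: two dimension sizes, still no data

    print(len(empty_1d), empty_1d.hex())          # 4 00000000
    print(len(empty_2d_4x0), empty_2d_4x0.hex())  # 8 0000000400000000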

Rolf Kalbermatter
My Blog
Message 18 of 22

I will take a little time to analyze more closely exactly how much data goes into the write each time. Though I had set it up to write the exact same number of values each time, I did see something that made me wonder whether I was getting a different-sized group of data somewhere along the way.

 

At this point, the datalog approach is working well and meets my needs for this application but I want to be able to understand why the other approach didn't work as it should have for future applications.

 

File size is something I will need to monitor and if it becomes too cumbersome, I will have to address that eventually.

 

Again, thanks for all the feedback.

 

Doug

Doug

"My only wish is that I am capable of learning each and every day until my last breath."
Message 19 of 22

@rolfk wrote:

@Ben wrote:

Also double check that empty array as Christian posted.

 



Empty arrays written to disk as a binary stream are n * 4 bytes long, with n being the number of dimensions of the array. Each of those 4-byte groups is an int32 set to the number of elements in that dimension, which doesn't need to be 0 for multidimensional arrays. A 4 x 0 2D array is fully legal in LabVIEW, resulting in 0 data elements but still a size of 4 for the first dimension. So physically it is empty, but logically it isn't.


 

So two calls with an empty array could explain the difference between 44 and 52.

 

If deciphering files didn't take so long, I'd try to confirm with the posted data files, but my coffee isn't fully engaged yet so I will pass.

 

Ben

 

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper | LinkedIn Profile | YouTube Channel
Message 20 of 22