
Access files (ascii and TDMS) by blocks, step by step

Whenever I perform a data acquisition and save the data, I save it to files (either ASCII or TDMS, depending on acquisition length, sampling rate, etc.).

Afterwards, when I want to access the data, the only way I have managed is to read it all at once. Because of that I have to turn to other software such as MATLAB for postprocessing, and if the files are big enough, I sometimes run into memory-related trouble when reading them in their entirety.

 

I would like to know how (I am sure it can be done) to build a loop that reads files by blocks, step by step, controlling the file position, block length, etc.

 

Regards,

usuario

Message 1 of 9

Hi

 

If your files are already written, you just need to adapt the reading routine:

For ASCII files, the "Read from Text File" function has an option called "Read Lines". To activate it, right-click the function once it is on your block diagram; the icon changes and you can then wire the number of lines you want to read.
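For anyone doing the same kind of block reading outside LabVIEW (the original post mentions MATLAB for postprocessing), here is a minimal Python sketch of the same read-a-block-of-lines idea; the file name and block size are just placeholders:

```python
# Minimal sketch (not LabVIEW): read an ASCII data file a block of lines at a time.
# "measurement.txt" and LINES_PER_BLOCK are placeholders for your own values.
from itertools import islice

LINES_PER_BLOCK = 1000

with open("measurement.txt", "r") as f:
    while True:
        block = list(islice(f, LINES_PER_BLOCK))  # next N lines; empty list at EOF
        if not block:
            break
        # ... parse / process this block here ...
        print(f"read {len(block)} lines")
```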

 

For TDMS files, the TDMS Read function has "offset" and "count" inputs; use these to get just the chunk you want.
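And here is a rough, non-LabVIEW illustration of the offset/count idea, using the npTDMS Python package (the file, group, and channel names below are made up); slicing a channel reads only that range of samples for that channel:

```python
# Rough sketch (not LabVIEW): offset/count-style partial read of one TDMS channel
# with the npTDMS package (pip install npTDMS). Names below are placeholders.
from nptdms import TdmsFile

offset, count = 50_000, 10_000

with TdmsFile.open("acquisition.tdms") as f:      # streaming mode, no full load into memory
    channel = f["Group"]["Channel 1"]             # hypothetical group/channel names
    chunk = channel[offset:offset + count]        # reads only these samples from disk
    print(f"got {len(chunk)} samples starting at offset {offset}")
```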

 

 

Hope this helps.


We have two ears and one mouth so that we can listen twice as much as we speak.

Epictetus

Antoine Chalons

Message 2 of 9

 

I will definitely try this; I didn't know about the "Read Lines" option. I can see clearly how "Read Lines" works for ASCII files, but for TDMS files, how is the count applied? That is, if I set the count to, say, 20 in a TDMS file with 2 channels, would I get the first 20 samples of channel 1, or the first 10 samples of each channel?

 

Thank you.

Message 3 of 9

I think for TDMS you have to wire the group and the channel(s) as inputs of the read function, so the count applies to the channel(s) you specify rather than being split across all channels.

 

To get a better idea, I suggest you download the TDM Excel Add-In Tool; it lets you open your TDMS files directly in Excel, and you will see one sheet per group and, inside each sheet, one column per channel of that group.

The first sheet is a summary of all the groups and lets you see the number of samples per group. Quite useful.
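If you end up inspecting files in Python instead of Excel, a similar overview can be printed with the npTDMS package (a sketch; the file name is a placeholder), listing each group, its channels, and the sample counts, much like the add-in's summary sheet:

```python
# Sketch: print a group/channel/sample-count summary of a TDMS file with npTDMS.
# "acquisition.tdms" is a placeholder file name.
from nptdms import TdmsFile

with TdmsFile.open("acquisition.tdms") as f:
    for group in f.groups():
        for channel in group.channels():
            print(f"{group.name} / {channel.name}: {len(channel)} samples")
```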


Antoine Chalons

Message 4 of 9

 

Thanks again, this is a useful tool.

 

Still, when I record data in TDMS format it is because I need to record a large amount of data (for example, a 500 kHz sampling rate for 10 seconds on more than one channel), and often that is more data than Excel columns can hold.

 

Any new proposals?

 

Message 5 of 9

Isn't there an option in the tool to get around that limit? I thought there would be, but I haven't looked 😮

 

Anyway, if you have that much data you can code a reading routine that first reads the total length of your data (found in the metadata of the TDMS file) and then reads the data in chunks (for instance, chunks of 10k samples) in a for loop, displaying each chunk in a graph.
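In Python terms (again, for readers postprocessing outside LabVIEW), that chunked loop could look like the sketch below, assuming the npTDMS package and placeholder file/group/channel names; the total length comes from the file metadata and each iteration reads only one chunk from disk:

```python
# Sketch: read one TDMS channel in chunks of 10k samples inside a loop (npTDMS).
# File, group, and channel names are placeholders.
from nptdms import TdmsFile

CHUNK = 10_000   # samples per read, mirroring the "chunks of 10k" suggestion

with TdmsFile.open("acquisition.tdms") as f:
    channel = f["Group"]["Channel 1"]
    total = len(channel)                         # total sample count, from metadata
    for offset in range(0, total, CHUNK):
        block = channel[offset:offset + CHUNK]   # only this slice is read from disk
        # ... update a graph / process `block` here ...
```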

 

That's if you want to keep the TDMS file format.

 

Another way would be to save your data to binary files with a file size limit, creating multiple files, so that you end up with something like this:

File_00.dat       100 MB

File_01.dat       100 MB

File_02.dat       100 MB

File_03.dat       100 MB

File_04.dat       81 MB    (this is the last file)

 

There is some code to help you do that here.
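Here is a minimal sketch of that "new file every N megabytes" idea, written in Python just for illustration (the 100 MB limit, the file-name pattern, and the data source are all assumptions):

```python
# Sketch: write a binary stream into numbered files of at most MAX_BYTES each.
# The file-name pattern, size limit, and data source are placeholders.
MAX_BYTES = 100 * 1024 * 1024   # 100 MB per file

def write_split(blocks):
    """blocks: an iterable of bytes objects (e.g. one DAQ read per block)."""
    index, written = 0, 0
    f = open(f"File_{index:02d}.dat", "wb")
    for block in blocks:
        if written and written + len(block) > MAX_BYTES:   # roll over to the next file
            f.close()
            index, written = index + 1, 0
            f = open(f"File_{index:02d}.dat", "wb")
        f.write(block)
        written += len(block)
    f.close()
```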


Antoine Chalons

Message 6 of 9

 

Dividing the file into a series during acquisition is a good idea too.

 

Thanks for your support; right now I don't have any acquisition running but will try these options next time for sure.

 

Regards.

Message 7 of 9

 

 

Anyway, for future readers of this thread, new suggestions are welcome.

 

 

Message 8 of 9

You may want to read Managing Large Data Sets in LabVIEW. It walks through many of the issues you are facing here, although it is a bit dated. After you read that, also read the LabVIEW help on the In Place Element Structure and the Data Value Reference. Approach both with caution, since both break dataflow (i.e., they can introduce race conditions), but they help you avoid making extra copies of large data.

 

Text files can be read incrementally, by either line or character. Since the ideal block read size for a Windows file system is about 65,000 bytes, I like to read my text files 65,000 bytes at a time and do any parsing from the memory buffer. When the buffer runs out, I fetch another 65,000 bytes, until the file is done. <shamelessAdvertising>This method will be the subject of my next post on object-oriented file I/O, which will, unfortunately, be about two to three weeks from now, due to my "real job" getting in the way 😛</shamelessAdvertising> If you are in a hurry and enjoy complex code, the Read From Measurement File Express VI in LVM mode also uses this method. If you open the front panel of one and dive into the code, you can see it in action.
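The same buffer-and-parse pattern, sketched in Python for illustration (the 65,000-byte block size is taken from the paragraph above; the file name and the parsing step are placeholders): it reads one block, parses the complete lines it contains, and carries any partial last line over to the next block.

```python
# Sketch: read a text file ~65,000 characters at a time (close to the 65,000-byte
# blocks above for ASCII data) and parse lines from the buffer, carrying any
# incomplete last line over to the next block. File name and parsing are placeholders.
BLOCK = 65_000

def parse_line(line):
    # placeholder for real parsing (e.g. splitting a row into numbers)
    return line.split("\t")

with open("measurement.txt", "r") as f:
    leftover = ""
    while True:
        chunk = f.read(BLOCK)
        if not chunk:
            if leftover:
                parse_line(leftover)     # last line without a trailing newline
            break
        lines = (leftover + chunk).split("\n")
        leftover = lines.pop()           # possibly incomplete final line
        for line in lines:
            parse_line(line)
```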

 

Good luck!  Let us know if you need more help.

Message 9 of 9