LabVIEW

Storing Data

RL wrote:

> Thanks Rolf,
>
> The frequency was included only to give information regarding speed.
> The measurements are rather short and consist of 500k to 1M points
> with only about 10 to 20 measurements/day.
>
> Am I back into database territory?

Well, it could be done, but I think you would be adding expensive,
resource-hungry infrastructure that was never meant to handle this
problem and gives you no real advantage.

Why not just save the data into a binary file instead? Make one new file
for each measurement, and structure the data in directories according to
day, project, or whatever your characteristics are. You could even go
as far as creating the actual binary file and another text file with a
different file ending, in which you store all the characteristics of the
measurement, such as date/time, scaling, customer, and even the
weather if you like.
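As a minimal sketch of this idea (in Python rather than LabVIEW, and with hypothetical names like `save_measurement`): one raw binary file per measurement, plus a sidecar text file with the same base name holding the descriptive characteristics.

```python
import json
import struct
from pathlib import Path

def save_measurement(folder, name, samples, metadata):
    """Write the samples as raw little-endian doubles, plus a sidecar
    text file with the same base name and a different extension that
    holds the measurement's characteristics."""
    folder = Path(folder)
    folder.mkdir(parents=True, exist_ok=True)
    data_path = folder / f"{name}.bin"
    meta_path = folder / f"{name}.txt"
    # Pack the samples as 8-byte IEEE doubles ("<" = little-endian).
    data_path.write_bytes(struct.pack(f"<{len(samples)}d", *samples))
    # Store the characteristics (date/time, scaling, customer, ...)
    # as readable text next to the data file.
    meta_path.write_text(json.dumps(metadata, indent=2))
    return data_path, meta_path
```

The equivalent LabVIEW VIs would use Write to Binary File and a string/INI write; the point is only the pairing of a compact data file with a human-readable sidecar.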

The VIs you would have to write for such functionality wouldn't really
be more difficult than figuring out how to program your database. Also,
you would need a real database if you want to save 1 MB of data into
binary large objects: Access just won't do it, and the database would
soon grow beyond any Access size limit anyway.

Rolf Kalbermatter
My Blog
Message 11 of 20
Hi

I agree with Rolf.
We have some sort of test executive software here. I save the script parameters in an INI file, and all the heavy data (files from a few kB up to 100 MB) are saved in a subfolder as binary files (and linked in the INI file). Every new test is created in its own directory, with the directory carrying the time stamp of the test. It is quite flexible and easy to implement. The only drawback I can see in the future will be looking up a particular test, but this is not a big problem, as it is possible to implement a search by date if required.

Note: I do not have experience with databases, so I cannot compare whether or not this is a better approach.

PJM


  


vipm.io | jki.net

Message 12 of 20
Thank you,

You are both helpful, and I am getting more perspective than from the several telephone calls I made. I discounted the binary format because of the descriptive string. Would it be a can of worms to save the time stamp and descriptive string to a spreadsheet (or another appropriate searchable format) after the data has been saved to a binary file? The time stamp would be the common link.
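A sketch of that index idea (Python stand-in, with a hypothetical helper name `append_to_index`): after each binary file is written, append one row of (time stamp, descriptive string, file name) to a CSV file that can later be searched or opened in a spreadsheet.

```python
import csv
from pathlib import Path

def append_to_index(index_path, timestamp, description, data_file):
    """Append one row to a searchable CSV index after the binary file
    has been saved; the time stamp is the common link to the data."""
    index_path = Path(index_path)
    is_new = not index_path.exists()
    with index_path.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            # Write the header row only when the index is first created.
            writer.writerow(["timestamp", "description", "data_file"])
        writer.writerow([timestamp, description, data_file])
```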
Message 13 of 20

Hi RL,

The following upcoming Webinar may help answer some of your questions:

http://sine.ni.com/apps/we/nievn.ni?action=display_offerings_by_event&site=NIC&node=200065&event_id=14219&event_subtype=GENERAL

Regards,

Khalid


Message 14 of 20
RL wrote:
> Thank you,
>
> You are both helpful and I am getting more perspective than from the
> several telephone calls I made. I discounted the Binary format
> because of the Descriptive String. Would it be a can of worms to save
> the Time Stamp and Descriptive String to a spreadsheet (or appropriate
> searchable format) after the data has been saved to a binary file.
> The Time Stamp would be the common link.

I think this is quite a good solution. The way I usually do this is to
keep the text file (spreadsheet if you want) and the data file in the
same directory and give them the same name with different endings. This
file name can be, and actually is, the timestamp 🙂
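A small sketch of this naming scheme (Python stand-in, hypothetical helper `paired_paths`): derive matching data and metadata paths whose shared base name is the measurement timestamp.

```python
from datetime import datetime
from pathlib import Path

def paired_paths(folder, when=None):
    """Derive matching .bin/.txt paths whose shared base name is the
    measurement timestamp, e.g. 20040608_152828.bin / .txt."""
    when = when or datetime.now()
    base = when.strftime("%Y%m%d_%H%M%S")
    folder = Path(folder)
    return folder / f"{base}.bin", folder / f"{base}.txt"
```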

Rolf Kalbermatter
My Blog
Message 15 of 20
Thank you for spending time on this. It is in a week's time, but I doubt that I will be committed to anything before then. It appears that NI heard the cries of anguish.

Regards,
Message 16 of 20
Hi

>The way I usually do this is to keep the text file
>(spreadsheet if you want) and the data file in the
>same directory and give them the same name with
>different endings. This file name can be, and
>actually is, the timestamp 🙂

I actually let the user select the file name (let them think they have some freedom 🙂 ), but not the location where the data is stored, and my time stamp is the folder name :).
So the directory structure looks like this:
..\data\Test Data Files\Jun 08 2004 (152828)\my test.ini which links any binary files in:
..\data\Test Data Files\Jun 08 2004 (152828)\Bin Data\

I am probably overdoing it with the time stamp all the way down to seconds, but since several hardware setups *could* be used simultaneously and the data *could* be exchanged, I decided to play it safe.
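That directory layout can be sketched like this (Python stand-in for the LabVIEW VIs, hypothetical helper `make_test_dirs`): a per-test directory named after the timestamp, with a Bin Data subfolder for the binary files.

```python
from datetime import datetime
from pathlib import Path

def make_test_dirs(root, when=None):
    """Create a per-test directory named after the time stamp, with a
    'Bin Data' subfolder, mirroring the layout
    ..\\data\\Test Data Files\\Jun 08 2004 (152828)\\Bin Data."""
    when = when or datetime.now()
    test_dir = Path(root) / when.strftime("%b %d %Y (%H%M%S)")
    bin_dir = test_dir / "Bin Data"
    bin_dir.mkdir(parents=True, exist_ok=True)
    return test_dir, bin_dir
```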

PJM


  


vipm.io | jki.net

Message 17 of 20
I'm using LabVIEW 8.0 and the Database Connectivity Toolkit to store data from a 2D array in an Oracle database.
 
This is the structure I programmed to store the data:
 
1) DB Tools Open Connectivity
2) Within a while loop, in each iteration I index one of the XY value pairs and convert the values to decimal strings, together with the iteration number and a constant, so I store four values. I concatenate these four values into the SQL statement "insert into my_table (ID_TEST, KONSTANT, FORCE, STRAIN) values(" which I wire to DB Tools Execute Query.vi, and this to DB Tools Free Object.vi.
3) When the whole array has been read, the while loop ends, and finally (outside the while loop) I close the database with DB Tools Close Connection.vi.
 
With the above procedure I can store the data in the Oracle table without any problem. However, storing 2,000 points (four values/columns each) takes some 3 minutes.
 
Does anyone know of a faster way to store the data in the Oracle table?
 
Thanks
Simbani 
 
Message 18 of 20
Hi Simbani,

I hope you're doing well.  It sounds like you are looking for a more efficient way to write your data into your Oracle database.  My bet is that DB Tools Execute Query.vi is what is taking the most time.  I'm not sure there is a way to speed up the execution of a DB query from within LabVIEW or the Database Connectivity Toolkit itself, as the time is most likely spent in the query you are doing and in the database that performs it.

To verify that it is the actual query that takes the most time, you can use the VI Profiler.  To access the VI Profiler in LabVIEW 8.0, navigate to Tools » Profile » Performance and Memory.  You can also take tick counts before and after the VI and subtract them to get execution times.  If the query is indeed the bottleneck, it becomes more of a question about the query itself, which may be outside the realm of my knowledge of supporting the toolkit, but some of the other users who have experience with SQL might chime in.  You may also want to take a look at this KnowledgeBase for some general performance tips.

Also, your post isn't directly related to the topic in this thread, so in the future I would recommend starting a new thread (especially since this thread is 3 months old!).  Let us know what you find, and we will do our best to help you.  Thanks!
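One technique worth naming here, beyond profiling: issuing one insert statement per point is usually the bottleneck itself, and batching all the rows into a single statement/transaction is the standard fix. This is not something the original posters describe; as a hedged sketch, here is the batching idea in Python, with the stdlib `sqlite3` module standing in for the Oracle connection (Oracle drivers expose the same `executemany`-style array binding).

```python
import sqlite3

def bulk_insert(conn, rows):
    """Insert all rows in one executemany call inside a single
    transaction, instead of one query per point."""
    with conn:  # one transaction for the whole batch
        conn.executemany(
            "INSERT INTO my_table (ID_TEST, KONSTANT, FORCE, STRAIN) "
            "VALUES (?, ?, ?, ?)",
            rows,
        )

# Example: 2,000 points in a single batch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (ID_TEST, KONSTANT, FORCE, STRAIN)")
bulk_insert(conn, [(i, 1.0, i * 0.5, i * 0.25) for i in range(2000)])
```

In LabVIEW terms, the analogous change is to build the parameterized statement once and pass the whole array, rather than concatenating and executing a new SQL string on every loop iteration.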

Thaison V
Applications Engineer
National Instruments

Message Edited by Thaison V on 09-11-2006 05:52 PM

Message 19 of 20

Hi Thaison,

Yes, I'm doing well; I hope you are too.

I'll try to find out where most of the time is lost.

I'll follow your suggestions when posting a question.

Regards

Simbani
Message 20 of 20