NI TestStand


What is a good strategy to reliably log test data to a network database?

Getting data from TestStand into a database reliably, given network glitches and outages, and even database outages, seems to be a pretty common concern. That NI doesn't just add a setting that can be enabled tells me there isn't a universal solution. I've read through a number of threads on this forum, and I haven't seen anything that gives a complete "big picture" approach, but a good solution looks like something that leverages TS's built-in database logging and/or offline results processing. The "big picture" solutions I have surmised from the forum threads, and from experience with databases, are:

 

1) Disable database logging, enable saving to offline results files, and run a task in MS Windows (one that runs the TS Offline Results Processing Utility) from a TS callback to process the files. The task would have to validate that it can talk to the database, and pause if it can't. (A sketch of this gating logic follows the list.)

 

2) Monitor whether the database is available, and enable TS's database logging when it is, or offline results when it is not. When the database comes back, process any offline results, again with a task in MS Windows.

 

3) Write to a local database on the test station computer, and use existing database tools to copy the data to the networked database when it is available. (I looked at database replication a while ago. I don't remember the details, only that it didn't seem like a great solution.)
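
For options 1 and 2 the gating logic is the same: probe the database and only process (or enable logging) when it answers. Below is a minimal Python sketch of such a watcher; the host, port, and utility path are placeholders, not real TestStand APIs, so substitute values from your own installation:

    import socket
    import subprocess
    import time

    DB_HOST = "db.example.com"  # hypothetical database server
    DB_PORT = 1433              # default SQL Server port; adjust for your DBMS
    UTILITY = r"C:\path\to\OfflineResultsProcessingUtility.exe"  # placeholder path
    POLL_SECONDS = 60

    def database_reachable(host, port, timeout=5.0):
        """Cheap TCP-level probe. A production check should also attempt a real
        login or a trivial SELECT; a listening port doesn't prove the DB is healthy."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    while True:
        if database_reachable(DB_HOST, DB_PORT):
            # Launch the processor only while the database answers; otherwise pause.
            subprocess.run([UTILITY], check=False)
        time.sleep(POLL_SECONDS)

Keeping the probe in the watcher means the sequence itself never has to carry retry logic.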

 

Questions:

1) Are there any other approaches that I am missing?

2) Are there problems with any of these approaches? Intuitively, disabling TS's database logging seems like a not-so-great idea.

3) Why can't NI provide this functionality out of the box? (This is just out of curiosity)

4) Do TS's Database Logging and Offline Results Processing Utility already gracefully handle database connectivity glitches?

 

David Grucza, CLD
Message 1 of 5

I forgot a 4th option that doesn't involve the test stations at all: Maintain a local database server, and use database replication to move data to a central server. A local server is highly reliable, so test stations would rarely encounter errors writing to it.
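
If DBMS-level replication turns out to be more than you need, the copy step can also live in a small application-level forwarder. A hedged sketch, assuming a local SQLite buffer table named results with a synced flag, and a hypothetical ODBC DSN for the central database:

    import sqlite3
    import pyodbc

    LOCAL_DB = "results_buffer.db"      # local SQLite buffer on the test station
    NETWORK_DSN = "DSN=CentralResults"  # hypothetical ODBC data source name

    def sync_once():
        """Forward pending rows to the central database; returns the count."""
        local = sqlite3.connect(LOCAL_DB)
        rows = local.execute(
            "SELECT id, serial, step, status, ts FROM results WHERE synced = 0"
        ).fetchall()
        if rows:
            remote = pyodbc.connect(NETWORK_DSN, timeout=5)
            cur = remote.cursor()
            for row_id, serial, step, status, ts in rows:
                cur.execute(
                    "INSERT INTO results (serial, step, status, ts) VALUES (?, ?, ?, ?)",
                    serial, step, status, ts,
                )
                remote.commit()  # commit remotely first...
                # ...then mark locally: a crash in between can at worst
                # duplicate one row, never lose one.
                local.execute("UPDATE results SET synced = 1 WHERE id = ?", (row_id,))
                local.commit()
            remote.close()
        local.close()
        return len(rows)

This gives at-least-once delivery; a unique key on the central table (e.g., serial plus timestamp) turns the rare duplicate into a harmless rejected insert.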

David Grucza, CLD
Message 2 of 5

Hi David, 

 

While I don't have anything you've not already covered, given the need for local logging and replication anyway (e.g., when there are outages), the local DB with replication solution has worked well for us over time.

Regards, 

-Thom

Message 3 of 5

Thanks for the input, Thom. I had sort of discounted this approach because the department manager doesn't want to deal with hardware. But to me it makes sense: pay Microsoft or Google (or any other company) to administer the database on the local server as well as on a cloud server. Having the hardware on site is really easy.

 

The only disadvantage of out-of-the-box replication is if there are multiple sites that need to send data to the cloud server and you don't want all of the sites to have data from all of the other sites, e.g., you don't want data from one CM to be viewable by another CM.

David Grucza, CLD
Message 4 of 5

Hi David_Grucza,

 

Thanks for the KUDO.  

 

Just a bit of clarification, though:

 

For us, the bigger concern is LAN outages; they are common enough to expose the issue.

 

Also, I don't mean to suggest that additional hardware is needed. Assuming there is a computer involved (e.g., the one running the test application) that can run a local 'DB server' (the term is quoted because really any local data source/sink will do: Excel files, flat files, JSON text files, MS SQL Server Express, an MS Access file, any ODBC data source, etc.), no additional hardware should be needed. Any local sink is likely sufficient, unless there is a lot of data or it is coming in very quickly.

 

This local 'buffering' of the data simply ensures that the data is captured. There is additional overhead, yes: the replication (traded for all the overhead of managing this some other way). Consider building the 'replicator/data sync-er' separately from the test application and having it just run on the machine at startup, syncing when needed.
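
A minimal sketch of such a stand-alone sync-er, assuming the test application buffers each result as one JSON line in a spool folder (the folder names, table schema, and DSN are illustrative assumptions, not anything TestStand provides):

    import json
    import time
    from pathlib import Path

    import pyodbc

    SPOOL = Path(r"C:\TestResults\spool")  # hypothetical buffer folder
    SENT = Path(r"C:\TestResults\sent")    # uploaded files move here
    NETWORK_DSN = "DSN=CentralResults"     # hypothetical ODBC data source name

    def push_file(path, conn):
        """Upload every JSON-line record in one spool file, then archive it."""
        cur = conn.cursor()
        with path.open() as fh:
            for line in fh:
                rec = json.loads(line)
                cur.execute(
                    "INSERT INTO results (serial, step, status) VALUES (?, ?, ?)",
                    rec["serial"], rec["step"], rec["status"],
                )
        conn.commit()
        path.rename(SENT / path.name)  # move only after a successful commit

    while True:
        files = sorted(SPOOL.glob("*.jsonl"))
        if files:
            try:
                conn = pyodbc.connect(NETWORK_DSN, timeout=5)
            except pyodbc.Error:
                conn = None  # LAN/DB down: leave files buffered and retry later
            if conn is not None:
                SENT.mkdir(exist_ok=True)
                for f in files:
                    push_file(f, conn)
                conn.close()
        time.sleep(60)

Because a file only moves to the sent folder after a successful commit, an outage mid-upload leaves it queued for the next pass (at the cost of possible duplicates, which a unique key on the central table can absorb).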

 

Regards, 

-Thom 

 

Message 5 of 5