Log data before and after occurrence of a trigger condition.


Hello,

 

I am working on an application that acquires data from 64 channels and performs several kinds of analysis.

One thing that needs to be implemented is logging data when an alarm condition occurs.

I have implemented data logging in many previous applications, but this one is tricky.

 

I need to log a few seconds of data before and a few seconds after the alarm trigger occurs into one file.

 

I thought of continuously writing data to a TDMS file and simultaneously deleting old data, until the alarm trigger occurs.

However, I haven't been able to do it, since I do not have the functions to delete data from a TDMS file.

 

I am looking for the ideal approach to do this logging.

 

Any help will be appreciated.

 


Regards.
Digant Shetty (LV 18.0)
AE, Combined Digilog Systems Pvt. Ltd.


Message 1 of 8
Solution
Accepted by topic author psystein

What I have done for this in the past is use a lossy queue as a circular buffer.  When you get the trigger, you dump the data in the queue to your file and then record however much data you want after the trigger.

 

For memory-allocation reasons, do not use Flush Queue.  Instead, use a Dequeue Element inside of a conditional FOR loop with a 0 timeout (stop reading from the queue when you get a timeout or you have read X samples).
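LabVIEW diagrams can't be shown inline here, but the pattern above can be sketched in Python as a rough analogue: `collections.deque(maxlen=N)` behaves like a finite lossy queue (appending to a full buffer silently drops the oldest element), and draining it one element at a time mirrors a Dequeue Element inside a conditional FOR loop rather than a Flush Queue. The buffer size and trigger position below are hypothetical stand-ins:

```python
from collections import deque

# deque(maxlen=N) acts like LabVIEW's lossy queue: when full, a new
# append silently discards the oldest element instead of blocking.
PRE_TRIGGER_SAMPLES = 5        # hypothetical pre-trigger buffer size
buffer = deque(maxlen=PRE_TRIGGER_SAMPLES)

logged = []                    # stands in for the log file
trigger_index = 8              # hypothetical sample at which the alarm fires

for i in range(12):            # stand-in for the acquisition loop
    buffer.append(i)           # lossy enqueue: never blocks, never errors
    if i == trigger_index:
        # On trigger, drain one element at a time (the analogue of
        # Dequeue Element in a conditional FOR loop with a 0 timeout).
        while buffer:
            logged.append(buffer.popleft())

print(logged)                  # the 5 samples leading up to the trigger
```

After the drain, the same loop keeps lossy-enqueuing, so the buffer simply refills for the next trigger.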


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 2 of 8

I agree with Crossrulz, the finite-length lossy queue is your friend.  We are recording videos and want to record "before, during, and after" an event.  We create a lossy queue large enough for the "before" videos, and always enqueue into this lossy queue.  When the event occurs, we start dequeuing as fast as we can from the lossy queue, and continue dequeuing until the "after" period expires, at which point the queue is allowed to refill.

 

Bob Schor

Message 3 of 8

I have been trying to avoid using Queues.

So I was trying to use a TDMS file instead.

 

I may have found the solution to not being able to delete data from a TDMS file. 

I can over-write the existing data in a TDMS file.

This is based on this topic: https://forums.ni.com/t5/LabVIEW/Rewrite-the-tdms-file-keeping-some-of-the-data-values/td-p/3131696

 

By avoiding queues, I can buffer the data and write it to file in effectively a single operation.

 

I am going to try that now; if it doesn't work, your suggestion will do it.

Thanks.


Regards.
Digant Shetty (LV 18.0)
AE, Combined Digilog Systems Pvt. Ltd.


Message 4 of 8

TDMS cannot replace a queue, and I see little reason to avoid them.

 

Also, starting in LabVIEW 2015 there is a TDMS Delete Data function that operates on a closed TDMS file; performing a close, delete, then re-open might be an option if you are dealing with a reference.

Message 5 of 8

Thanks for that input Hooovahh.

I am aware of the dedicated function available in LabVIEW 2015; however, I am not on that version yet.

 

I am using the "TDMS Set Next Write Position" function to overwrite the data already written to the file.

However, I haven't been able to test the functionality of the code I have built.

 

Could you kindly elaborate on "TDMS cannot replace a queue, and I see little reason to avoid them"?

 


Regards.
Digant Shetty (LV 18.0)
AE, Combined Digilog Systems Pvt. Ltd.


Message 6 of 8

Hooovahh,

 

I have just tried the code with TDMS, and the results are not what I am looking for.

I need some help on how to implement the needed functionality using lossy queues.

 


Regards.
Digant Shetty (LV 18.0)
AE, Combined Digilog Systems Pvt. Ltd.


Message 7 of 8
Solution
Accepted by topic author psystein

From the description in your marked solution and my following comment, you should be able to figure this out.  The notion is that when you create the queue, you set its length to the finite size that exactly accommodates your "pre-trigger" data.  You always do a Lossy Enqueue, so the queue never blocks.  As long as the trigger hasn't occurred, you do not dequeue; you just let the lossy queue fill and overflow.

 

When the Trigger occurs, you "switch states" and start dequeuing as fast as you can (if you really want to ensure that you don't miss anything, you can dequeue into a "regular" queue, but this slightly complicates the logic).  By keeping track of how many elements you are removing from the Queue, you can tell when you've removed (a) the pre-trigger elements + (b) the post-trigger elements, at which point you can stop dequeuing and let the Queue fill up again.

 

There is one "catch" to this scheme -- what happens if a trigger occurs during the post-trigger interval, before you "go idle"?  I don't remember how we decided to handle this, but the easiest thing to do is to simply ignore that trigger ...
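The two-state scheme described above can be sketched in Python as a textual stand-in for the LabVIEW diagram. Again `deque(maxlen=PRE)` plays the lossy queue; the pre/post counts and trigger position are hypothetical, and any trigger arriving during the dump is ignored, as suggested:

```python
from collections import deque

PRE, POST = 4, 3         # hypothetical pre- and post-trigger sample counts
buf = deque(maxlen=PRE)  # lossy queue sized exactly for the pre-trigger data

state = "idle"
remaining = 0            # elements still to dequeue after a trigger
logged = []              # stands in for the log file
trigger_index = 10       # hypothetical sample at which the alarm fires

for i in range(20):      # stand-in for the acquisition loop
    buf.append(i)        # always lossy-enqueue; the queue never blocks
    if state == "idle" and i == trigger_index:
        state = "dumping"
        remaining = PRE + POST   # pre-trigger + post-trigger elements
    if state == "dumping":
        # Dequeue as fast as we can; the state check above means any
        # trigger occurring during this interval is simply ignored.
        while buf and remaining:
            logged.append(buf.popleft())
            remaining -= 1
        if remaining == 0:
            state = "idle"       # go idle and let the queue refill

print(logged)  # samples 7..13: the buffered window plus 3 post-trigger samples
```

Counting `remaining` down from `PRE + POST` is what tells the consumer when the post-trigger period has been captured, at which point the queue is allowed to fill up again.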

 

Bob Schor

Message 8 of 8