LabVIEW


When does the timer in a stacked sequence start counting down?

Hi there,

  I am reading some code developed by someone a long time ago. The code uses a stacked sequence structure to perform operations on external devices in series. Each frame of the structure contains a timer to introduce a delay. Each frame also has two I/O operations on the analog terminals, plus a digital terminal in the first frame, but those operations are all independent. For example, one operation outputs a digital 5 V signal to open a shutter so that light can pass through and shine on an amplifier. The amplifier's gain factor is controlled by an analog voltage, and that analog voltage must be sent to the amplifier before the light arrives. Here are my questions:

 

1) In the code, the digital and analog outputs are placed in the same frame, so in what order are the signals output? Will the analog and digital signals be sent out simultaneously, or is it possible that one will be sent ahead of the other? If the output order is random, it seems possible that the amplifier gain will not be set before the shutter opens.

 

2) Each frame of the stacked sequence contains a timer to delay for some amount of time before jumping to the next frame. When does the timer start counting down? Does it start as soon as the frame begins executing, or only after all the other operations in the frame have finished?

 

Thanks

 

P.S. I am running the code in LabVIEW 7.

Message 1 of 4

Instead of lengthy, confusing descriptions, show us some code!

 

If a frame contains several operations plus a wait, the wait will start concurrently with the operations. (Of course, on a single-processor system the operations cannot all execute at exactly the same time, so their execution order is not determined. Still, the total time to execute the sequence frame will be determined by the operation that takes the longest, which is typically the wait.)
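As an analogy only (LabVIEW is graphical, so this is a hedged Python sketch rather than actual LabVIEW code): if the independent operations and the wait in a frame all start together, the frame finishes after the slowest of them, so the frame time is roughly max(operation time, wait time), not their sum.

```python
import threading
import time

def run_frame(operation_s: float, wait_s: float) -> float:
    """Simulate one sequence frame: an operation and a wait run concurrently.

    Returns the total elapsed time for the frame.
    """
    start = time.perf_counter()
    op = threading.Thread(target=time.sleep, args=(operation_s,))
    wait = threading.Thread(target=time.sleep, args=(wait_s,))
    op.start()
    wait.start()   # the wait starts alongside the operation, not after it
    op.join()
    wait.join()    # the frame ends only when the slowest node finishes
    return time.perf_counter() - start

elapsed = run_frame(operation_s=0.05, wait_s=0.2)
print(f"frame took ~{elapsed:.2f} s")  # close to max(0.05, 0.2), not the sum
```

This mirrors LabVIEW's dataflow rule: nodes with no data dependency between them may execute in any order, and the frame completes when every node in it has completed.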

Message 2 of 4

@altenbach wrote:

Instead of lengthy, confusing descriptions, show us some code!

 

If a frame contains several operations plus a wait, the wait will start concurrently with the operations. (Of course, on a single-processor system the operations cannot all execute at exactly the same time, so their execution order is not determined. Still, the total time to execute the sequence frame will be determined by the operation that takes the longest, which is typically the wait.)


So does that mean the wait starts counting down as soon as the current frame begins executing? I wrote some code to test that:

 

[Attachment: testtimed.png]

 

From the test I found that the wait is triggered as soon as the frame starts executing, which means the wait's countdown can finish before all the other code in the same frame completes. I thought the wait was meant to control the delay after all the other code in that frame had finished, but it turns out that is not the case. So if I want to control the delay after one frame, I should insert a blank frame in between and put the wait component there. But how accurate is the wait time? According to the help, the resolution of the wait component is one millisecond; can it really control the wait time to within one millisecond? I need to separate two frames by EXACTLY 1 ms, but it does not seem that precise (this controls the external shutter's on/off interval).

Message 3 of 4

@dragondriver wrote:
But how accurate is the wait time? According to the help, the resolution of the wait component is one millisecond; can it really control the wait time to within one millisecond? I need to separate two frames by EXACTLY 1 ms, but it does not seem that precise (this controls the external shutter's on/off interval).

This is a limitation of the OS, not of LabVIEW. Maybe you should use LabVIEW RT or FPGA. What should actually happen in your program at the start and end of the 1 ms wait? Is it an analog or a digital output, for example?

 

Even under plain Windows, you can always, for example, output a hardware-timed waveform that goes true for exactly 1 ms. What kind of DAQ hardware do you have?

Message 4 of 4