Counter/Timer


Count rising edges of counter with Python


You're absolutely right. I am extremely constrained by the few things I know at the moment.

 

We need digital outputs. But I thought the counter outputs were kind of digital, too. I guess that's not the case!?

 

The two pulses I drew above are one-time-only, and the counter/clock with the threshold is just there to set up the correct time between pulse 1 and pulse 2; that time is not always the same.

If we need a measurement with a delay of 1 ms, for example, we will use the 100 kHz clock and the threshold will be 100.

f = 1/T = 100 kHz --> T = 1e-5 s --> after 100 clock periods we have 1 ms (just a quick example, hopefully I didn't miscalculate)

 

Pulse 1 comes along and as soon as this happens, there must be exactly that 1 ms time window before pulse 2 starts.

Within one measurement this 1 ms is fixed, but if another person needs a 100 µs time window someday, that should be possible too, maybe by assigning another clock and/or threshold.

The high and low times of the pulses are also fixed during one measurement, but in general they should be configurable.

 

Maybe you can tell me how to do this in LabVIEW and hopefully I can do some reverse engineering (please keep in mind: I don't have a clue about LabVIEW).

Or would you mind explaining in a little more detail what this buffered digital output and the sample clock are, please? 🙂 I think this would be helpful.

Message 11 of 15

If you can't find examples for finite DO with a sample clock under Python, you should at least find some for finite AO.  Very little of the config will need to change for DO, only the data you write to the task to define your 2 pulses with a fixed delay between them.

 

You will define a buffer of values for each of the 2 DO lines.  It will need to contain enough samples to last just over 1 msec at your chosen sample rate.  Almost all the values will be False, or 0, or whatever syntax Python uses to describe a digital low state.   At very specific indices in that buffer, you'll set a couple key values True or 1.

 

Let's just suppose a sample rate of 100 kHz.  You'll need at least 102 samples to make sure you can define 2 pulses that go low->high->low again and are 1.000 msec apart.

 

Only the 1st sample of one line and the 101st of the other should be set True or 1.  All other samples on both lines are set to False or 0.  Write this data to the task, start it, and wait for it to complete just over 1 msec later.  And that's really all there is.
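Just to make that concrete, a quick Python sketch of those two buffers might look like this (plain lists, purely illustrative names):

sample_rate = 100000                     # 100 kHz sample clock
num_samples = 102                        # just over 1 msec worth of samples

pulse_1_buffer = [False] * num_samples   # everything low by default
pulse_2_buffer = [False] * num_samples

pulse_1_buffer[0] = True                 # 1st sample high on line 1
pulse_2_buffer[100] = True               # 101st sample high on line 2, 1.000 msec later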

 

When someone else wants to run with different time spacing, you only have to change which value in the buffer you set True.  If you want to make it easy to accommodate pulse spacing up to 100 msec, you'd make a buffer that initializes both DO lines to 10002 samples of False or 0.  Then you'd set the 1st and the <Nth> sample to True or 1 to accommodate the desired spacing.
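The index math itself is only a couple of lines of Python, for example (hypothetical names, same 100 kHz rate as above):

num_samples = int(round(max_spacing_s * sample_rate)) + 2     # e.g. 0.100 s --> 10002 samples, all False
pulse_2_index = int(round(desired_spacing_s * sample_rate))   # which sample to set True on the second line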

 

 

-Kevin P

Message 12 of 15

Okay, I tried to do what you said.

This is what I got (Python Code):

 

import nidaqmx
from nidaqmx.constants import AcquisitionType

task = nidaqmx.Task()
task.do_channels.add_do_chan("Dev1/port0/line0")

# 1 Hz sample clock, 2 finite samples: first sample high, second sample low
task.timing.cfg_samp_clk_timing(1, sample_mode=AcquisitionType.FINITE, samps_per_chan=2)
task.write([True, False])

task.start()
task.wait_until_done()
task.stop()
task.close()

 

This generates a 1 s high (and after that it stays low forever, but that's okay).

Very good!

But I think this isn't very scalable, is it? I have to do a little bit of math every time someone changes something.

Can't I just apply a clock to two DOs and say:

DO1 be high, now ... wait delta t ... DO2 be high, now! 😄

Message 13 of 15
Solution accepted by soxxes

Computers are pretty good at "doing math every time someone changes something."  I wouldn't let that be a roadblock.

 

Here's some very crude pseudo-code to help illustrate what I mean:

CONST double max_duration = 0.101;   // plan to generate for 101 msec
double sample_rate = 100000;         // plan on a standard sample rate of 100 kHz
double delay = get_user_input("How many seconds delay between the pulses?");
int num_samples = int(max_duration * sample_rate);

boolean pulse_1_buffer[num_samples]; // wrong real-life syntax, use dynamic allocation
boolean pulse_2_buffer[num_samples];

// initialize buffers of digital line data to be all False
for (int i = 0; i < num_samples; i++)
{
    pulse_1_buffer[i] = False;
    pulse_2_buffer[i] = False;
}

int num_samples_delay = int(delay * sample_rate);
pulse_1_buffer[0] = True;
pulse_2_buffer[num_samples_delay] = True;

// Now create your DO task for 2 lines and write these buffers to the task.
// API syntax may require that you have a single buffer variable that's
// effectively a 2D array.  (Left as an exercise for the reader)
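
A rough Python/nidaqmx translation of the same idea might look like the sketch below. It's untested; "Dev1", the line numbers, and the exact shape task.write expects for multi-line boolean data are assumptions you may need to adjust for your hardware and driver version:

import nidaqmx
from nidaqmx.constants import AcquisitionType, LineGrouping

sample_rate = 100000                          # 100 kHz sample clock
max_duration = 0.101                          # generate for just over the largest delay
delay = 0.001                                 # seconds between the two pulses
num_samples = int(max_duration * sample_rate)

# one buffer per line, everything low except the two pulse samples
pulse_1_buffer = [False] * num_samples
pulse_2_buffer = [False] * num_samples
pulse_1_buffer[0] = True
pulse_2_buffer[int(round(delay * sample_rate))] = True

with nidaqmx.Task() as task:
    # both lines in ONE task, one channel per line (assumed line names)
    task.do_channels.add_do_chan("Dev1/port0/line0:1",
                                 line_grouping=LineGrouping.CHAN_PER_LINE)
    task.timing.cfg_samp_clk_timing(sample_rate,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=num_samples)
    task.write([pulse_1_buffer, pulse_2_buffer])   # 2D buffer: 2 lines x num_samples
    task.start()
    task.wait_until_done()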

 

 

-Kevin P

Message 14 of 15

Ohhh yes! I tried a mix of your pseudo-code and mine and it worked!

The buffer has to be a 2D array. Brilliant comment!

And I didn't know that I had to put both DOs in one task. The whole time I tried to assign one task to one DO and another task to the other DO. Kind of stupid in hindsight.

 

You are right, a computer should calculate this easily. My task now is to implement this in as user-friendly a way as possible. 😄 But I am confident.

 

Kevin, you really really helped me out. Thank you very much!

Am I able to give you something like an award or can I at least mark your answers as helpful?

If you need some help with Python one day, please let me know. I would like to give something back to you. 🙂

Message 15 of 15