Counter Input timing difference between PCI-6251 and PCIe-6374

I'm using the PCI-6251 for CI-Period measurements. When I give the counter the start trigger, it immediately starts measuring, stores the time to the first rising edge, and then continues measuring and storing the times between subsequent rising edges. I would like to replicate this behavior using a PCIe-6374. After changing only the device in the code from the PCI-6251 (Dev 2) to the PCIe-6374 (Dev 3), the device starts measuring from the first rising edge instead of from the start trigger. As a result, I am missing that first time measurement value when using the PCIe-6374.

Does anyone know how I can make the PCIe-6374 measure in the same way as the PCI-6251 as described above?

Message 1 of 2

This is a built-in behavior change from M-series to X-series counters.  There's nothing you can do about it *directly*.  It seems to be called "Incomplete Sample Detection", and I posted about it to the DAQ Idea Exchange several years ago.  Read the comments and follow through to the related discussion for more context.

 

It sounds like you could make use of the "equivalency" idea I outlined there, where you have to think kinda "inside out" about the counter signals. Instead of using period mode directly, you fool your hardware by programming an edge counting task where you've interchanged the signal assignments.

 

To break it down in plainer terms, here's what normal period measurement mode does:

- You only identify the signal to be measured.  That signal gets routed to the counter's *gate* pin, where it behaves much like a sample clock.
- Meanwhile, behind the scenes, a known internal timebase gets routed to the counter's *source* pin, where every clock cycle increments the count register.
- So the source signal increments the count, and the gate signal captures / samples the instantaneous count.
- One last special thing in period measurement mode: the count gets internally reset back to 0 immediately after each capture.
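To make those mechanics concrete, here's a tiny pure-Python model of the counter in period mode (just a sketch of the hardware behavior described above, not DAQmx code): the timebase advances the count, each gate edge captures it, and the count resets to 0 after every capture.  Note how the very first capture is the time from arm/trigger to the first edge — the sample the X-series hardware drops in real period mode.

```python
# Toy model of a counter in period-measurement mode (not real DAQmx code).
# Timebase ticks increment the count; each gate (measured-signal) edge
# captures the instantaneous count, after which the count resets to 0.

TIMEBASE_HZ = 20_000_000  # 20 MHz internal timebase, available on both devices

def simulate_period_mode(gate_edge_ticks):
    """gate_edge_ticks: timebase tick numbers at which gate edges occur."""
    captures = []
    last_reset = 0  # count starts at 0 when the counter is armed / triggered
    for tick in gate_edge_ticks:
        count = tick - last_reset   # ticks accumulated since the last reset
        captures.append(count)      # gate edge samples the count...
        last_reset = tick           # ...and the count resets to 0
    return captures

# Gate edges at these timebase ticks -> periods of 2000, 3000, 2500 ticks
counts = simulate_period_mode([2000, 5000, 7500])
periods_s = [c / TIMEBASE_HZ for c in counts]
print(counts)      # [2000, 3000, 2500]
print(periods_s)   # [0.0001, 0.00015, 0.000125]
```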

 

Ok, take a breath, here comes the equivalency stuff.

 

You could instead program an edge counting task where you identify that same internal timebase as the edge counting source and then identify the signal to measure as the sample clock source.  Deep down in the hardware, the counter sees the same signals routed the same way and behaves pretty much identically.  The main differences are:

 

- in edge counting mode, the device won't reset the count to 0 after each sample.  You'll need to do some simple post-processing to compute the finite differences yourself.

- you won't get the benefit of DAQmx doing the math to give you a period in units of seconds.  You'll get raw counts, and you'll need to account for the actual timebase frequency to convert to seconds.  There are DAQmx properties you can query to determine the timebase frequency used in the task.  Note that your devices have different max timebases (80 MHz on the PCI-6251, 100 MHz on the PCIe-6374), but they both have a 20 MHz one.
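Putting it together, a rough sketch of such a task using the Python nidaqmx API (as a textual stand-in for the equivalent DAQmx VIs in LabVIEW — the device name "Dev3", the "/Dev3/20MHzTimebase" terminal, and the assumption that your signal is wired to PFI0 are all placeholders you'd adapt to your setup):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

TIMEBASE_HZ = 20_000_000  # 20 MHz timebase, available on both devices

# Edge-counting task with the signal roles interchanged:
# the internal timebase is the thing being counted, and the
# signal to be measured acts as the sample clock.
with nidaqmx.Task() as task:
    ch = task.ci_channels.add_ci_count_edges_chan("Dev3/ctr0", edge=Edge.RISING)
    ch.ci_count_edges_term = "/Dev3/20MHzTimebase"  # count timebase ticks

    # The measured signal (assumed wired to PFI0) clocks the samples.
    # 'rate' is nominal; DAQmx uses it mainly for buffer sizing.
    task.timing.cfg_samp_clk_timing(
        rate=100_000.0,
        source="/Dev3/PFI0",
        active_edge=Edge.RISING,
        sample_mode=AcquisitionType.CONTINUOUS,
    )

    task.start()
    counts = task.read(number_of_samples_per_channel=100)

# Post-processing: the first count is the time from task start (counter
# armed at 0) to the first rising edge -- the very sample the X-series
# period mode drops -- and finite differences give the remaining periods.
ticks = [counts[0]] + [b - a for a, b in zip(counts, counts[1:])]
periods_s = [t / TIMEBASE_HZ for t in ticks]
```

One nice side effect: assuming the counter is armed with a count of 0 when the task starts (or triggers), the first sample is itself the trigger-to-first-edge time you've been missing.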

 

 

-Kevin P

Message 2 of 2