Digital I/O

PXI-6289 digital input sample time jitter

I am using a PXI-6289 to monitor a number of digital signals through its digital I/O pins, reading them with DAQmx clocked off the 10 MHz onboard reference clock. I am reading 2,000 samples at a time, so I would expect the loop to execute every 200 microseconds (us). When I run the attached code with a tick count microsecond timer, the actual loop time varies between 166 and 247 us, and only 34% of the loops complete in exactly 200 us. I was expecting the sample clock to do a better job of controlling sample timing and computational jitter. Any suggestions? I have considered replacing the while loop with a timed loop; if I do that, do I still need to use the 10 MHz sample clock?
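In text form, what the attached VI does is roughly the following Python nidaqmx sketch (the device name, channel string, clock terminal, and buffer size below are placeholders, not values from the attachment):

```python
# Hardware-timed DI at 10 MHz, 2,000 samples per read, loop time measured in software.
import time
import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE = 10_000_000      # 10 MHz sample clock
SAMPLES_PER_READ = 2_000      # 2,000 samples -> nominally 200 us per read

with nidaqmx.Task() as task:
    # On the PXI-6289, hardware-timed (correlated) DI is available on port 0
    task.di_channels.add_di_chan("PXI1Slot2/port0/line0:7")
    task.timing.cfg_samp_clk_timing(
        rate=SAMPLE_RATE,
        source="/PXI1Slot2/10MHzRefClock",      # guess at the 10 MHz onboard clock routing
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=SAMPLES_PER_READ * 10,   # input buffer size hint
    )
    task.start()

    loop_times_us = []
    prev = time.perf_counter()
    for _ in range(1000):
        # Blocks until 2,000 samples are available, so each iteration should
        # nominally take 200 us; the variation seen is OS/driver call timing.
        task.read(number_of_samples_per_channel=SAMPLES_PER_READ)
        now = time.perf_counter()
        loop_times_us.append((now - prev) * 1e6)
        prev = now

print(f"loop time min/max: {min(loop_times_us):.0f}/{max(loop_times_us):.0f} us")
```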

Message 1 of 3

The jitter you are measuring is the variation in the time between successive DAQmx Read calls, i.e. the software timing of your loop, not the timing of the sample clock itself.

If you want to measure the jitter of the sample clock, acquire a known signal with the DAQ device (e.g. a 1 MHz pulse train from a function generator) and check the returned waveform for any distortion. See Accuracy of the Waveform Timestamp Returned by NI-DAQmx.
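In text form, that check could look something like the sketch below (device/line names and the clock terminal are placeholders; it assumes a 1 MHz square wave wired to a single port 0 line). At a 10 MHz sample clock every 2,000-sample block spans exactly 200 us of signal, so each block should contain about 200 rising edges no matter how much the software loop time varies.

```python
# Sketch: verify hardware sample timing by counting edges of a known signal.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.di_channels.add_di_chan("PXI1Slot2/port0/line0")
    task.timing.cfg_samp_clk_timing(
        rate=10_000_000,
        source="/PXI1Slot2/10MHzRefClock",   # guess at the same clock routing as above
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=20_000,
    )
    task.start()
    for _ in range(100):
        # Single-line channel: read() returns a list of booleans
        data = task.read(number_of_samples_per_channel=2_000)
        rising = sum(1 for a, b in zip(data, data[1:]) if b and not a)
        # Each block covers 2,000 samples x 100 ns = 200 us, so expect ~200
        # rising edges of a 1 MHz signal, independent of software loop jitter.
        print(f"rising edges in block: {rising}")
```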

-------------------------------------------------------
Applications Engineer | TME Systems
Message 2 of 3

FWIW, your 2,000-sample reads at 10 MHz make this a Windows-controlled 5 kHz software reading loop, and I would consider jitter in the realm of 40-50 microseconds in such a loop to be a big win. That's far better than I would dare tell someone to expect.

Would your app allow for reading more samples at a lower loop rate, such as 20k samples at a time for a 500 Hz loop rate?
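Rough numbers for why that helps (a sketch; the ~50 us figure is just the jitter magnitude mentioned above): the hardware samples at 10 MHz either way, but a bigger read makes the OS scheduling jitter a much smaller slice of each loop period.

```python
SAMPLE_RATE = 10_000_000   # Hz, unchanged
OS_JITTER_S = 50e-6        # ~50 us of Windows scheduling jitter per read

for samples_per_read in (2_000, 20_000):
    period_s = samples_per_read / SAMPLE_RATE     # nominal loop period
    print(f"{samples_per_read} samples/read -> {1 / period_s:.0f} Hz loop, "
          f"jitter = {OS_JITTER_S / period_s:.1%} of the period")
# 2000 samples/read -> 5000 Hz loop, jitter = 25.0% of the period
# 20000 samples/read -> 500 Hz loop, jitter = 2.5% of the period
```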

 

 

-Kevin P

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 3 of 3