

Can I synchronize change detection to a timebase that is slower than 80 MHz?

Hello, 

 

Is there a way to synchronize the change detection process to a slower clock than the 80 MHz timebase?  I am using a PCI-6259 board to detect changes on multiple digital input lines, store each change, and also store a time stamp for each.  I am using a C program and DAQmx function calls.  I use DAQmxCreateDIChan() and DAQmxCfgChangeDetectionTiming() to collect the changes on any of 32 lines into a buffer.  I create a counter with DAQmxCreateCICountEdgesChan() that counts the edges of a 20 MHz signal.  I route the ChangeDetectionEvent signal to the Sample Clock of the counter using DAQmxExportSignal(), so the counter latches its current count whenever there is a change.
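
A minimal sketch of that kind of setup with the DAQmx C API is shown below.  The device name "Dev1", port0, the PFI0 export terminal, the buffer sizes, and the use of the internal 20MHzTimebase terminal are all assumptions, and error checking is omitted; it illustrates the configuration described above rather than the poster's actual code.

#include <NIDAQmx.h>

TaskHandle diTask = 0, ciTask = 0;

/* Change-detection task on 32 digital input lines (both edges) */
DAQmxCreateTask("", &diTask);
DAQmxCreateDIChan(diTask, "Dev1/port0/line0:31", "", DAQmx_Val_ChanForAllLines);
DAQmxCfgChangeDetectionTiming(diTask,
                              "Dev1/port0/line0:31",   /* rising-edge lines  */
                              "Dev1/port0/line0:31",   /* falling-edge lines */
                              DAQmx_Val_ContSamps, 1000);

/* Make the ChangeDetectionEvent pulse available on a terminal */
DAQmxExportSignal(diTask, DAQmx_Val_ChangeDetectionEvent, "/Dev1/PFI0");

/* Counter task: count edges of the 20 MHz timebase, latching a sample
   each time the exported ChangeDetectionEvent pulse arrives */
DAQmxCreateTask("", &ciTask);
DAQmxCreateCICountEdgesChan(ciTask, "Dev1/ctr0", "", DAQmx_Val_Rising,
                            0, DAQmx_Val_CountUp);
DAQmxSetCICountEdgesTerm(ciTask, "Dev1/ctr0", "/Dev1/20MHzTimebase");
DAQmxCfgSampClkTiming(ciTask, "/Dev1/PFI0", 1000000.0,
                      DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000);

DAQmxStartTask(ciTask);
DAQmxStartTask(diTask);
/* ...read DI change samples and latched counter values, then stop/clear... */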

 

It works fine up to a point.  But our application can have simultaneous changes on more than one input line.  At 80 MHz, the change detection process sometimes stores these in a single data word, but it often clocks in a separate data word for each changed line, presumably because the changes land on opposite sides of a 12.5 nanosecond clock period.  That would not be a problem, except that the ChangeDetectionEvent pulse is 150 nanoseconds wide, and if two of them are generated too close together they overlap and the counter only sees one.  So only one timestamp is stored and the data cannot be correlated.

 

If I could slow down the change detection, the signals to the counter would come farther apart, and that might solve the problem.  I have tried the following calls for the change detection task:


DAQmxGetSampClkTimebaseSrc()
DAQmxGetSampClkTimebaseRate()
DAQmxGetMasterTimebaseSrc()
DAQmxGetMasterTimebaseRate()

 

All of them give an error saying that they are not supported by my device.
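
For reference, those queries look roughly like this in C (changeDetTask is a placeholder for the change detection task handle); on this task each call fails with the "not supported by device" error, and DAQmxGetExtendedErrorInfo() retrieves the full error text:

#include <stdio.h>
#include <NIDAQmx.h>

char src[256];
float64 rate;
int32 err;

err = DAQmxGetSampClkTimebaseSrc(changeDetTask, src, sizeof(src));
err = DAQmxGetSampClkTimebaseRate(changeDetTask, &rate);
err = DAQmxGetMasterTimebaseSrc(changeDetTask, src, sizeof(src));
err = DAQmxGetMasterTimebaseRate(changeDetTask, &rate);

if (err < 0) {
    char errBuf[2048];
    DAQmxGetExtendedErrorInfo(errBuf, sizeof(errBuf));   /* full error message */
    printf("%s\n", errBuf);
}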


Is there any other way to slow down the change detection?  I did not find any documentation for this.  Is it included in any of the help files or manuals?

 

Thanks for your help,

 

Jane

 

Message 1 of 4

I don't think there's a way to configure how change detection is sync'ed.  I believe it's part of the board circuitry.  I was in a thread about this a while back -- look here for the two posts by Ryan from NI, messages 34 and 35 in that thread.

 

So it sounds like you're kinda stuck.  I briefly tested the idea of configuring an M-series counter task to use dev1/di/sampleclock as its own sample clock, but that produced an error because the route isn't allowed.  (I hoped this might be a sneaky way to use the internal signal that triggers DI change detection buffering, rather than the 150 nsec ChangeDetectionEvent pulse.)

 

Any chance your app can be reworked to use constant-rate sampling, and then you can perform post-processing to find the changes?  If not, my only other idea is to suggest you look into a product from one of NI's Alliance partners -- Viewpoint Systems. It's a digital board that can do change detection at up to 20 MHz.  What makes it a bit unique compared to NI's boards is that when you perform change detection tasks, it *always* stores a timestamp for each sample.  You don't have to route out to a counter.  (Better yet, you could later use that same timestamp / port state data to generate an exact copy of the digital pattern you captured.  The beauty is that you are only required to define digital port states at the timestamps where transitions occur.  You get 20 MHz timing resolution for digital pattern output without needing to define 20 million mostly redundant patterns per second of output.   I used an earlier ISA-bus version of that board to generate precision-timed stepper motor trajectories.)
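
If the constant-rate route were workable, the post-processing itself is cheap -- roughly like the hypothetical helper below, which assumes the port states come back as 32-bit samples:

#include <stdint.h>
#include <stdio.h>

/* Hypothetical helper: scan constant-rate port samples and report each
   sample where any of the 32 lines changed state. */
void find_changes(const uint32_t *samples, size_t n, double rate_hz)
{
    for (size_t i = 1; i < n; i++) {
        uint32_t changed = samples[i] ^ samples[i - 1];   /* bits that toggled */
        if (changed)
            printf("t = %.9f s  changed mask = 0x%08X  new state = 0x%08X\n",
                   i / rate_hz, changed, samples[i]);
    }
}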

 

-Kevin P.

Message 2 of 4

Hi Kevin,

 

Thanks so much for responding.  I had already read your posts and Ryan's posts in the previous thread. He also measured the width of the ChangeDetectionEvent signal to be 150 nanoseconds, so I know this is the limitation. 

 

I glanced at the web page for the DIO-64 from Viewpoint Systems, and it looks like it would do the job, but it is pricey.  I will look further to see if there is anything with only 32 bits.  And 1 or 2 MHz would suffice for our data.

 

Doing constant-rate sampling would not be practical at the high rates we would need to sample and store data over DMA in Windows.  I am trying to put together a new system to replace an older event timer that one of our groups has, an ISA device built by our own engineers that works well at 1 MHz.  We work with neurophysiological spike data along with auditory signals and other events.

 

I do have another idea that I plan to try next week.  Since there is another counter on the 6259 board, I will configure it the same way as my first counter, but use the falling edges of the ChangeDetectionEvent as its clock.  I can then look at the counts from both counters (counting the 80 MHz timebase instead of the 20 MHz one) and see if consecutive times are always ~150 nanoseconds apart.  Not sure if this will work, but it is worth a try.
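
A sketch of that two-counter arrangement follows (device name "Dev1" assumed, error checking omitted, and assuming the internal ChangeDetectionEvent terminal can be used directly as each counter's sample clock source -- otherwise an exported PFI copy of the pulse serves the same purpose):

TaskHandle ctrRise = 0, ctrFall = 0;

/* Counter 0: latch its 80 MHz count on each rising edge of ChangeDetectionEvent */
DAQmxCreateTask("", &ctrRise);
DAQmxCreateCICountEdgesChan(ctrRise, "Dev1/ctr0", "", DAQmx_Val_Rising, 0, DAQmx_Val_CountUp);
DAQmxSetCICountEdgesTerm(ctrRise, "Dev1/ctr0", "/Dev1/80MHzTimebase");
DAQmxCfgSampClkTiming(ctrRise, "/Dev1/ChangeDetectionEvent", 1000000.0,
                      DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000);

/* Counter 1: identical, but latch on the falling edge of the same pulse */
DAQmxCreateTask("", &ctrFall);
DAQmxCreateCICountEdgesChan(ctrFall, "Dev1/ctr1", "", DAQmx_Val_Rising, 0, DAQmx_Val_CountUp);
DAQmxSetCICountEdgesTerm(ctrFall, "Dev1/ctr1", "/Dev1/80MHzTimebase");
DAQmxCfgSampClkTiming(ctrFall, "/Dev1/ChangeDetectionEvent", 1000000.0,
                      DAQmx_Val_Falling, DAQmx_Val_ContSamps, 1000);

/* A 150 ns pulse at 80 MHz is about 12 counts, so if a falling-edge count
   trails its rising-edge count by noticeably more than ~12, a second change
   probably arrived while the first pulse was still high. */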

 

Thanks again for your help,

Jane
 

Message 3 of 4

Hi Jane,

 

It appears that you have posted the same question in another thread, found here. I am currently examining the problem along with Brooks, and we will post a response on the original thread, as we prefer to keep each issue together in a single thread. Thanks for your patience and understanding!

Daniel S.
National Instruments
Message 4 of 4