09-10-2015 05:01 AM
I have a C++ application that uses NI-DAQmx Base to read up to 5 input signals at 1 kHz each, using the simple sequence
- DAQmxBaseCreateTask()
- DAQmxBaseCreateAIVoltageChan()
and then DAQmxBaseStartTask()
While reading all the samples, I also run a simple loop that sends TTL signals to devices and reads input from another TTL source. Images are also displayed on the computer screen with specific timing that is managed by the CPU clock.
My question is: how can I synchronize the CPU clock with the AI samples? When I start the task, what is the delay before the first sample is taken? How can I minimize that latency (if any)? I just want to know what the AI value was in, for instance, the interval [-100, 100] ms around a specific moment on the CPU clock, with as little delay as possible.
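For illustration, here is one way the mapping from a CPU-clock moment to sample indices could be sketched (this is an assumption on my part, not a tested solution): record the CPU clock immediately before `DAQmxBaseStartTask()` returns, treat sample k as taken at t0 + k/fs, and ignore the (hopefully small and constant) start latency. `windowIndices` is a hypothetical helper, not an NI function:

```cpp
#include <cstdint>
#include <utility>

// t0_us: CPU time (microseconds) captured right before the task started.
// fs: sample rate in Hz. Sample k is assumed to occur at t0 + k / fs,
// neglecting the task's start latency (or after calibrating it out).
// Returns the [first, last] sample indices covering the window
// [tEvent - halfWindow, tEvent + halfWindow] around a CPU event.
std::pair<int64_t, int64_t> windowIndices(int64_t t0_us, double fs,
                                          int64_t tEvent_us,
                                          int64_t halfWindow_us) {
    int64_t lo_us = tEvent_us - halfWindow_us - t0_us;
    int64_t hi_us = tEvent_us + halfWindow_us - t0_us;
    int64_t first = static_cast<int64_t>(lo_us * fs / 1e6);
    int64_t last  = static_cast<int64_t>(hi_us * fs / 1e6);
    if (first < 0) first = 0;  // window starts before acquisition began
    return {first, last};
}
```

For example, at 1 kHz with t0 = 0, an event at t = 500 ms with a ±100 ms window maps to samples 400 through 600.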
Thanks!
09-10-2015 08:52 AM
Replying to myself: suppose I am reading 2 channels at 2 kHz, but only the 1st channel is used all the time, for about one hour. The 2nd channel is only needed occasionally, for approx. 1 second at a time, so I can throw away 99% of that data. How would you recommend syncing that time? These 1-second slots are NOT known beforehand: they are calculated based on the 1st channel, and I would like to sync them as accurately as possible.
So I assume I will have to read and process the channel-1 buffer as quickly as possible (if I only read it once every 10 ms, I can never pin down the 2nd channel's data more accurately than 10 ms).
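To put a number on that: the worst-case detection latency is simply the read chunk size divided by the sample rate, since a slot boundary in channel 1 can only be seen after the read that contains it completes. A trivial sketch (`worstCaseLatencyMs` is a made-up helper for the arithmetic):

```cpp
// Worst-case delay (ms) before a sample can be seen by the
// processing loop, given how many samples are read per call.
double worstCaseLatencyMs(double sampleRateHz, int samplesPerRead) {
    return 1000.0 * samplesPerRead / sampleRateHz;
}
```

So at 2 kHz, reading 20 samples per call gives the 10 ms figure above; reading 2 samples per call would bound it at 1 ms, at the cost of more read calls per second.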
I can't start/stop an individual channel, if I'm right? Only a task can be started/stopped, and I assume that starting/stopping a task is CPU-intensive, causing extra delays that would make the 1-second slot's start time even more inaccurate.
I could of course just keep ALL the data and handle things in post-processing, but that isn't possible: I sometimes need "live" action to be taken based on signal calculations, and I want this live action to be as time-accurate as possible.
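A possible middle ground between keeping everything and losing the slot data (a sketch under my assumptions; `RecentSamples` is a hypothetical helper, not part of DAQmx Base): acquire both channels continuously, but keep only the last few seconds of channel-2 samples in a ring buffer, so that when a slot is detected from channel 1 the matching channel-2 samples can still be extracted retroactively, sample-aligned, without storing the whole hour:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Keeps the most recent `capacity` samples of one channel so that a
// slot detected later (from another channel) can still be extracted.
class RecentSamples {
public:
    explicit RecentSamples(size_t capacity)
        : buf_(capacity), capacity_(capacity) {}

    // Append samples as they arrive from the read loop.
    void push(const double* data, size_t n) {
        for (size_t i = 0; i < n; ++i) {
            buf_[total_ % capacity_] = data[i];
            ++total_;
        }
    }

    int64_t totalSamples() const { return total_; }

    // Copy samples [first, first + count) into out; returns false if
    // they were already overwritten or have not been acquired yet.
    bool extract(int64_t first, size_t count,
                 std::vector<double>& out) const {
        if (first < 0 || first + (int64_t)count > total_) return false;
        if (total_ - first > (int64_t)capacity_) return false;  // too old
        out.resize(count);
        for (size_t i = 0; i < count; ++i)
            out[i] = buf_[(first + i) % capacity_];
        return true;
    }

private:
    std::vector<double> buf_;
    size_t capacity_;
    int64_t total_ = 0;
};
```

Since both channels run off the same task's sample clock, the channel-1 sample index where the slot was detected directly indexes the channel-2 history; no task restart is needed.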
So basically, I have several channels of data at 2 kHz, and I want to mark/process that data as time-accurately as possible. That's why I'm looking to sync the sample clock with the clock of the processing PC.