08-23-2011 10:53 PM
How should I set up my program to read the data every 1.000 second exactly?
I'm using USB-6210 for continuous acquisition.
To set the sampling rate
DAQmxCfgSampClkTiming(gTaskHandle, "", rate, DAQmx_Val_Rising, DAQmx_Val_ContSamps, sampsPerChan);
To read the data
DAQmxReadAnalogF64(gTaskHandle, nSamples, 10.0, DAQmx_Val_GroupByScanNumber, gData, nSamples*gNumChannels, &numRead, NULL);
I set rate = sampsPerChan, which means I am supposed to read the data every second.
I use clock() to show me the time down to the millisecond.
After running for 100 seconds, the program is 0.002 seconds slow.
Different computers give different results after running the program for a few hours: some end up a few seconds fast, others a few seconds slow.
08-24-2011 12:05 PM
Hi moonlotus,
rate should be equal to nSamples (the parameter in DAQmxReadAnalogF64). The sampsPerChan parameter in DAQmxCfgSampClkTiming is just used to set the buffer size on continuous tasks and doesn't affect how frequently you read data.
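Putting that correction together, here is a minimal sketch of a continuous task where rate == nSamples, so each read blocks for about one second. The channel name "Dev1/ai0", the voltage range, and the loop count are assumptions; this requires the NI-DAQmx driver and attached hardware, so it will not run standalone:

```c
#include <stdio.h>
#include <NIDAQmx.h>  /* NI-DAQmx driver header */

#define RATE     1000  /* sample clock rate in Hz */
#define NSAMPLES 1000  /* samples per read; RATE == NSAMPLES -> one read per second */

int main(void)
{
    TaskHandle taskHandle = 0;
    float64    data[NSAMPLES];
    int32      numRead = 0;

    DAQmxCreateTask("", &taskHandle);
    /* "Dev1/ai0" and the +/-10 V range are assumed; adjust for your setup */
    DAQmxCreateAIVoltageChan(taskHandle, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    /* Continuous acquisition; the last argument only sizes the buffer */
    DAQmxCfgSampClkTiming(taskHandle, "", RATE, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, RATE);
    DAQmxStartTask(taskHandle);

    /* Each call blocks until NSAMPLES samples arrive: roughly 1 s per loop */
    for (int i = 0; i < 10; i++) {
        DAQmxReadAnalogF64(taskHandle, NSAMPLES, 10.0, DAQmx_Val_GroupByScanNumber,
                           data, NSAMPLES, &numRead, NULL);
        printf("read %d samples\n", (int)numRead);
    }

    DAQmxStopTask(taskHandle);
    DAQmxClearTask(taskHandle);
    return 0;
}
```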
Assuming rate is equal to nSamples, you should be reading data every 1 second to within 50 parts per million, as per the device specifications. So, after 100 seconds of running you might see up to a 0.005 second variance due to the 50 ppm accuracy of the timebase. The 0.002 seconds you are seeing after 100 seconds is therefore within the specified clock accuracy.
Don't forget that the computer clock also has its own inaccuracies which would also contribute to the measured error.
Best Regards,
08-24-2011 07:57 PM
Thanks for the explanation.
In this case, can I use a different clock source terminal to increase the accuracy?
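For reference, the source parameter of DAQmxCfgSampClkTiming is where an external clock would be supplied. A sketch, assuming the device can route a sample clock in on a PFI line (the terminal name "/Dev1/PFI0" and the availability of a more accurate external clock are assumptions):

```c
/* Instead of the onboard timebase (empty source string), pass a terminal
 * carrying a more precise external clock as the sample clock source: */
DAQmxCfgSampClkTiming(gTaskHandle, "/Dev1/PFI0", rate,
                      DAQmx_Val_Rising, DAQmx_Val_ContSamps, sampsPerChan);
```

The acquisition is then only as accurate as whatever clock is wired to that terminal.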