Hello-
I am trying to output an analog waveform for a set amount of time based on a digital trigger. I started from the Cont Gen Voltage Wfm-Int Clk-Dig Start example, modified it to generate a set number of output samples and stop once the output is done, and then placed it all inside a while loop so the output can be repeated. The hardware is a DAQCard-6715 running under LabVIEW 8.2.
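In case it helps to see it written out, here is roughly what each loop iteration does, expressed as the equivalent NI-DAQmx C calls rather than the actual block diagram. The channel name Dev1/ao0, the trigger line /Dev1/PFI0, the sample count, and the sine-wave math are placeholders for my real setup, and error checking is omitted; this is only a sketch of the configuration, not the VI itself.

/* Finite analog output, digital edge start trigger, re-armed each loop iteration. */
#include <NIDAQmx.h>
#include <math.h>

#define NUM_SAMPS   1000      /* set number of output samples per trigger (placeholder) */
#define SAMPLE_RATE 500000.0  /* AO sample clock, 500 kS/s */
#define NUM_LOOPS   256

int main(void)
{
    TaskHandle task = 0;
    float64    waveform[NUM_SAMPS];
    int32      written;
    int        i, loop;

    DAQmxCreateTask("", &task);
    DAQmxCreateAOVoltageChan(task, "Dev1/ao0", "", -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "", SAMPLE_RATE, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, NUM_SAMPS);
    DAQmxCfgDigEdgeStartTrig(task, "/Dev1/PFI0", DAQmx_Val_Rising);

    for (loop = 0; loop < NUM_LOOPS; loop++) {
        /* math operations that modify the waveform each iteration (placeholder) */
        for (i = 0; i < NUM_SAMPS; i++)
            waveform[i] = sin(2.0 * 3.14159265358979 * i / NUM_SAMPS);

        DAQmxWriteAnalogF64(task, NUM_SAMPS, 0, 10.0,
                            DAQmx_Val_GroupByChannel, waveform, &written, NULL);
        DAQmxStartTask(task);                /* arm the task; it waits for the digital edge */
        DAQmxWaitUntilTaskDone(task, 10.0);  /* generation stops after NUM_SAMPS samples */
        DAQmxStopTask(task);                 /* stop so the task can be re-armed next loop */
    }

    DAQmxClearTask(task);
    return 0;
}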
The problem I am facing is that the trigger pulse is only high for about 10 microseconds (100 microseconds at most), each loop iteration takes around 100 milliseconds, and the loop runs 256 times. With this configuration, at least half of the triggers are missed. We suspect this is because the digital trigger line is not being sampled fast enough. The sample rate for the waveform output is set to 500 kS/s, but changing it does not change the number of triggers detected. Is there a way to change the rate at which the trigger is sampled? From examples online, it appears that this rate could be controlled in past versions of LabVIEW.
The other concern is that the hardware simply cannot keep up at that rate. When the loop time was increased to 200 milliseconds, every trigger was accounted for, but it is unclear whether that is due to the longer iteration time or to the sampling happening to "line up" with the trigger signal. There are some operations inside the while loop, but they are mostly math operations that modify the waveform.
Any suggestions would be greatly appreciated. Thank you
Curtis Johnson