Hi,
I am currently working on a program that continuously acquires a signal as waveforms, which are then used to create a 2D image. I have written two versions of the program:
The first program does not scan continuously. The number of scans is determined primarily by the number of iterations I specify for the program's main loop. With each iteration, the loop calls AI Start.vi and then AI Read.vi, which reads 2000 scans and outputs them as waveform data. The waveforms acquired from the multiple scans are then used to create an image. This program works as desired.
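To make the structure of this first version clearer, here is a rough sketch in Python (the actual program is a LabVIEW VI, so this is only structural pseudocode that happens to run; `ai_start`, `ai_read`, and the simulated sine signal are hypothetical stand-ins for AI Start.vi, AI Read.vi, and the real hardware):

```python
# Structural sketch of program 1: start/read re-armed on every iteration.
# ai_start/ai_read are hypothetical stand-ins for AI Start.vi / AI Read.vi.
import math

SCANS_PER_READ = 2000

def ai_start():
    """Stand-in for AI Start.vi: arm the acquisition."""
    pass

def ai_read(n):
    """Stand-in for AI Read.vi: return n scans of a simulated waveform."""
    return [math.sin(2 * math.pi * 5 * i / n) for i in range(n)]

def acquire_image(num_iterations):
    waveforms = []
    for _ in range(num_iterations):
        ai_start()                              # re-armed each iteration,
        waveforms.append(ai_read(SCANS_PER_READ))  # so each read starts fresh
    return waveforms                            # rows of the 2D image

rows = acquire_image(10)
```

Because the acquisition is re-armed before every read, each waveform starts at the same point relative to the signal, which is presumably why the edges in the resulting image come out straight.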
The second program, which was meant to be an improvement on the first, reads continuously. I call AI Start.vi only once (avoiding the overhead of calling it repeatedly) and have it acquire data continuously. I then call AI Read.vi, which again reads 2000 scans at a time. The waveform data is sent to a queue; after the specified scanning time has elapsed, the program stops acquiring data, the post-acquisition analysis takes place, and an image is again produced. While this program is MUCH faster than the original, it doesn't work quite right. There is always "jitter" in the image, sometimes worse than at other times: where the first program shows a straight "edge" (i.e., consecutive waveforms have their peaks in the same positions), the second program shows a jagged edge. Occasionally the jitter is so bad that we can't use the data.
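Again as a rough Python sketch of the structure (not the actual LabVIEW code; `continuous_reads` and the `jitter` parameter are hypothetical, with `jitter` modeling the per-read phase offset I seem to be seeing):

```python
# Structural sketch of program 2: one start, continuous reads into a queue.
# The jitter parameter is a hypothetical model of each read beginning at a
# slightly different point in the signal, producing jagged edges.
import math
from queue import Queue

SCANS_PER_READ = 2000

def continuous_reads(total_reads, q, jitter=0.0):
    # AI Start.vi equivalent would be called once here, then the
    # AI Read.vi equivalent loops, pulling 2000 scans per read.
    for k in range(total_reads):
        phase = jitter * k                      # k-th read starts offset
        wave = [math.sin(2 * math.pi * 5 * i / SCANS_PER_READ + phase)
                for i in range(SCANS_PER_READ)]
        q.put(wave)                             # producer side
    q.put(None)                                 # sentinel: acquisition done

def drain(q):
    # Post-acquisition analysis: pull all waveforms off the queue.
    rows = []
    while (w := q.get()) is not None:
        rows.append(w)
    return rows
```

With `jitter=0.0` the peaks of consecutive rows line up (a straight edge); any nonzero `jitter` shifts the peak position from row to row, which matches the jagged edges I see.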
So here's my conjecture: I think this has to do with the timing of the data acquisition. The jitter is due to consecutive scans being slightly offset from each other; the only question is why. In the first program, the start and read are synchronous because I acquire only one waveform at a time; in the second program, the data is being acquired continuously, so perhaps the "start" of each waveform falls at a slightly different point in the data stream. Is there an efficient way of making my data acquisition more regular?