
Synchronized Acquisition from Two USB Cameras (NI IMAQ, Vision Development Module)

Hi,

 

I am using the NI Vision Development Module with two USB 3.1 cameras to acquire and save pairs of images for stereovision calibration in separate software.

The cameras are: https://www.flir.com/products/blackfly-s-usb3/

I have the camera acquisitions running in the same loop, but there is a significant time delay between the paired images. This shows up when I run the images through my calibration software (bad calibration scores). I am assuming this is because the cameras are acquiring images at slightly different times in LabVIEW. I need to get the delay between the images down to a few microseconds.

Can I accomplish this with the full suite of VIs provided by the Vision Development Module, or do I need to employ hardware triggering as well (as outlined on FLIR's website)? Any feedback on the small code snippet I have attached is welcome. I opted for local variables instead of queues because I only want to save the newest image in the live camera feed; if there is a cleaner/better way to accomplish this I am open to suggestions.
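One way to quantify the skew (not from the original post; a Python sketch, since LabVIEW block diagrams can't be shown in text): tag every saved image with a host timestamp and only accept pairs whose timestamps agree within a tolerance. The `pair_frames` helper and all numbers below are illustrative assumptions:

```python
# Sketch: pair frames from two cameras by host timestamp (seconds),
# keeping only pairs that fall within a tolerance.

def pair_frames(ts_a, ts_b, tol_s=0.001):
    """Match each timestamp in ts_a to the nearest one in ts_b,
    keeping the pair only if they differ by less than tol_s."""
    pairs = []
    for i, ta in enumerate(ts_a):
        j = min(range(len(ts_b)), key=lambda k: abs(ts_b[k] - ta))
        if abs(ts_b[j] - ta) < tol_s:
            pairs.append((i, j))
    return pairs

# Camera B lags camera A by ~2 ms here, so a 1 ms tolerance rejects
# every pair; loosening the tolerance to 5 ms recovers all three.
ts_a = [0.000, 0.066, 0.133]
ts_b = [0.002, 0.068, 0.135]
print(pair_frames(ts_a, ts_b))                # → []
print(pair_frames(ts_a, ts_b, tol_s=0.005))   # → [(0, 0), (1, 1), (2, 2)]
```

With a real acquisition you would record the timestamp next to each saved frame and feed only the surviving pairs to the calibration tool.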

 

Thanks

Message 1 of 4

Your code doesn't look inherently problematic (at least, nothing jumped out at me as a definite cause of the synchronisation problems).

 

However, for sub-millisecond synchronisation you'll (almost?) certainly have to use hardware triggering. I don't think it's likely you'll manage it otherwise except by chance.

 

In terms of queues vs local variables, note that a queue prevents data loss (as you already noted), but if you allow loss, there's no guarantee that both cameras will lose the same number of frames, or lose them at the same positions. Since your processing loop reads the images with no attached frame information, you're relying on getting the same iteration's image from both cameras for them to be even remotely synchronised. So I'd probably suggest a queue instead. If you want fewer data points, you could implement some decimation in either the producer (Acq) or consumer (Processing) loop before saving.
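To illustrate the failure mode (a Python sketch with invented frame indices, not the poster's actual VI): two independent "newest value only" slots can end up holding frames from different iterations, and without an attached frame number the reader can't tell. Tagging each frame with its index at least makes the mismatch detectable:

```python
# Sketch: why two independent "latest value" locals can desynchronise.
# Each camera overwrites its own slot; if camera B falls one frame
# behind, the reader gets images from different moments in time.
import queue

def newest_only(q, item):
    """Emulate a lossy LabVIEW local/notifier: drain the single-slot
    queue before enqueueing, so only the newest element survives."""
    try:
        q.get_nowait()
    except queue.Empty:
        pass
    q.put_nowait(item)

slot_a, slot_b = queue.Queue(maxsize=1), queue.Queue(maxsize=1)
newest_only(slot_a, (3, "image_A3"))   # camera A has delivered frame 3
newest_only(slot_b, (2, "image_B2"))   # camera B is one frame behind

idx_a, img_a = slot_a.get()
idx_b, img_b = slot_b.get()
if idx_a != idx_b:
    print(f"skipping mismatched pair: A@{idx_a} vs B@{idx_b}")
```

A real fix along these lines would enqueue `(frame index, image)` tuples from each acquisition loop and only process pairs whose indices match.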


Message 2 of 4

@cbutcher wrote:

 

However, for sub-millisecond synchronisation you'll (almost?) certainly have to use hardware triggering. I don't think it's likely you'll manage it otherwise except by chance.


Even millisecond accuracy is basically unreachable without hardware triggering. The Windows OS makes absolutely no determinism guarantees: while you can usually get the timing consistent to within several milliseconds, there is no guarantee against an occasional delay of a second or more if Windows decides it has something more urgent to do. The same holds for the other desktop OSes, be it macOS or Linux, although on Linux you can tweak things to some extent with command-line options and kernel configuration, if you dig deep enough and care to spend that time. Millisecond accuracy is only achievable on true real-time systems, and several microseconds only with hardware triggering and/or FPGA programming.
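This is easy to demonstrate on any desktop OS. A small sketch (Python standing in for LabVIEW here) asks for a 1 ms wait repeatedly and records how late the OS actually returns; the overshoot is routinely tens to hundreds of microseconds and occasionally far more:

```python
# Sketch: measure how far the OS lets a 1 ms software wait drift.
import time

overshoots_us = []
for _ in range(100):
    t0 = time.perf_counter()
    time.sleep(0.001)                  # ask for exactly 1 ms
    actual = time.perf_counter() - t0
    overshoots_us.append((actual - 0.001) * 1e6)

print(f"worst overshoot: {max(overshoots_us):.0f} us")
```

The worst-case number varies run to run and machine to machine, which is exactly the non-determinism being described.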

Rolf Kalbermatter
Message 3 of 4

All,

 
Thank you for your responses. I will look into hardware triggering if my current solution stops working. With some software tweaks, I managed to consistently get images that are synchronized (enough) for my application. I’m sure that there is still some sort of delay, but it is definitely reduced and my results are consistent. I’ve attached my updated VI below.
 
Long story short: I created one script for saving calibration images and another for image processing (I didn't include the rest of my processing loop in the original VI I posted; it is computationally expensive, with a ~2 s loop time). I replaced the Get Image VIs with Grab VIs and wired False to the "wait for next buffer" input. I also raised the acquisition frame rate to 15 fps (about the max for my cameras). Finally, I moved the save-image case structure inside the acquisition loop, removing the local variables. Again, this isn't perfect synchronization, but I can now consistently calibrate my stereovision system with these image pairs. Hopefully this is useful to someone else.
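Since the block diagram itself can't be shown in text, here is the reworked loop structure as a Python sketch; `make_camera`/`grab_newest` are invented stand-ins for the IMAQdx sessions and the Grab VI with "wait for next buffer" wired False:

```python
# Sketch of the reworked structure: grab both cameras back-to-back
# inside a single loop and save the pair immediately, instead of
# handing images to a second loop through local variables.
import itertools

def make_camera(name):
    frames = itertools.count()
    def grab_newest():
        # Stand-in for IMAQdx Grab with "wait for next buffer" = False:
        # return the latest buffered frame without blocking.
        return (name, next(frames))
    return grab_newest

grab_left, grab_right = make_camera("left"), make_camera("right")

saved_pairs = []
for _ in range(3):                      # acquisition loop
    img_l = grab_left()                 # back-to-back grabs keep the
    img_r = grab_right()                # pair as close as software allows
    saved_pairs.append((img_l, img_r))  # "save" case now lives in-loop

print(saved_pairs[-1])                  # → (('left', 2), ('right', 2))
```

The design point is that nothing sits between the two grabs and the save, so the residual skew is just the back-to-back grab time plus whatever the OS injects.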
Message 4 of 4