02-25-2012 05:24 PM
I am using LabVIEW to capture images from my GigE camera and would like to implement real-time summation of images. What I mean by this is that for a given number of frames, images are added together sequentially, with the resulting summed image being displayed as images are being captured. As I am fairly new to using LabVIEW for imaging applications, what is the best way to go about this? I seem to recall reading about the possibility of overflow/saturation when summing multiple 16-bit images in LabVIEW. Since I have set my camera to output images in a monochrome 16-bit pixel format, is this something I need to worry about?
02-25-2012 06:35 PM
Do you really mean "summation", or "averaging"?
If you take each monochrome pixel's 16-bit value, adding two together could easily saturate (reach the upper storable limit of a 16-bit value, which is 65535). As each image is added, more and more pixels hit this maximum and wash out to white. This is fine if your images are particularly dim, but if they are not, the image will quickly become a solid white rectangle.
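To see the overflow concretely (a NumPy sketch, since the same arithmetic applies whatever toolkit does the addition; the pixel values are made up for illustration):

```python
import numpy as np

# Two hypothetical pixel values from 16-bit monochrome frames:
# one bright pixel (40000) and one dim pixel (100) in each frame.
a = np.array([40000, 100], dtype=np.uint16)
b = np.array([40000, 100], dtype=np.uint16)

# Summing directly in uint16 overflows: 40000 + 40000 = 80000 > 65535,
# and here the result wraps around modulo 65536 (other tools may clip
# to 65535 instead, but either way the true sum is lost).
wrapped = a + b

# Accumulating into a wider type (e.g. uint32) preserves the true sums.
safe = a.astype(np.uint32) + b.astype(np.uint32)
```

The dim pixel survives either way, but the bright one is ruined unless the running sum lives in a wider type than the incoming frames.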
Now, if you mean averaging, then this would seem more reasonable. The average of two or more images would show the average brightness at each pixel, and hence give you a mean representation of all the captured data.
Are you trying to implement a simple average/summation of a set number of frames, or a more complicated moving window of averages/sums?
If you want to do this 'live', then you are likely to need to store all the images being acquired in a buffer and perform the calculations at each frame capture.
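If a moving window is what's needed, the "store the images in a buffer" idea above can be sketched like this (Python/NumPy standing in for the LabVIEW buffers; the window length and tiny frame size are arbitrary choices for illustration):

```python
from collections import deque

import numpy as np

WINDOW = 5                                  # arbitrary window length
running = np.zeros((2, 2), dtype=np.int64)  # tiny "frame" for illustration
frames = deque()                            # buffer of the last WINDOW frames

def push(frame):
    """Add a frame to the moving window and return the current window sum."""
    global running
    if len(frames) == WINDOW:
        running -= frames.popleft()         # oldest frame leaves the window
    frames.append(frame)
    running += frame
    return running
```

Dividing the returned sum by `len(frames)` gives the moving average; the key point is that each capture costs one subtraction and one addition rather than re-summing the whole window.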
02-25-2012 10:35 PM
Thank you kindly for your response. I think image summation is what I am looking for. I am taking rather low-intensity images; as an example, I typically need hundreds of images summed together with my camera set for a 1 second exposure time to get a reasonably bright image (this was when I could only save individual images one at a time and sum them afterwards). Also, if I'm not mistaken, averaging images is good for eliminating random sources of noise. In my case, however, there is always an explicit source of background noise that I need to subtract away later. All I'm looking for right now is just a simple summation of a set number of images where I can see the result of the summed image in real-time.
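That later background subtraction step might look like this (a hedged NumPy sketch with made-up pixel values; it assumes the background is a summed dark stack taken with the same exposure and frame count as the signal stack):

```python
import numpy as np

# Hypothetical summed signal image and summed background image, both
# accumulated in a wide signed integer type.
summed = np.array([[2100, 2400], [2050, 2600]], dtype=np.int64)
background = np.array([[2000, 2000], [2100, 2000]], dtype=np.int64)

# Subtract the background; clip at zero so pixels where the background
# happened to fluctuate high do not go negative.
corrected = np.clip(summed - background, 0, None)
```

Using a signed type for the subtraction matters: doing it in an unsigned type would wrap any negative result around to a huge value.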
02-27-2012 10:58 AM
Yes, it does sound like summation is what you want. Are you using the IMAQ toolkit? If so, there are functions for performing mathematical operations on images, including an "Add" function. Use this to sum two images together. If you perform this function successively to your stored image with each newly acquired image, you will get a rolling summation. You can then display this on the screen as the images are acquired and summed.
You will need two IMAQ image buffers, one for the incoming images, the other for the stored summed image. This second is the one you need to display on-screen.
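The two-buffer loop described above can be sketched as follows (Python/NumPy standing in for the IMAQ buffers, with a dummy `acquire_frame()` standing in for the camera grab; names, frame size, and pixel range are all hypothetical):

```python
import numpy as np

N_FRAMES = 200
SHAPE = (4, 4)                         # tiny frame for illustration

def acquire_frame(i):
    # Stand-in for the GigE grab: a dim 16-bit monochrome frame.
    rng = np.random.default_rng(i)
    return rng.integers(0, 20, size=SHAPE, dtype=np.uint16)

# Buffer 2: the stored summed image, widened so 200 sums cannot saturate.
summed = np.zeros(SHAPE, dtype=np.uint32)

for i in range(N_FRAMES):
    frame = acquire_frame(i)           # buffer 1: the incoming image
    summed += frame                    # the "Add" step of the rolling sum
    # update the on-screen display with `summed` here for a live view
```

The incoming buffer is overwritten on every iteration, so memory use stays constant no matter how many frames are summed.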
02-28-2012 05:57 PM
Thanks again for your help! After putting together a VI, however, I've run across an issue that I'm not certain is related to my implementation. I created a test image by summing 200 images together at 22 FPS with no external light sources, just to get an idea of what sort of background I'll have to deal with. However, I've noticed that the image is noticeably asymmetric in terms of intensity; specifically, the image is essentially dark at the right side but gradually brightens as you move across, reaching a maximum at the left side. As I am quite certain the camera is shielded from all external light sources, is there something amiss with the way I am summing the images together? I've attached the image I took, along with the VI (the image is posted as a JPEG to reduce the file size; however, the VI saves images in PNG format).
02-29-2012 03:44 AM
I took a quick look at your code. I don't have IMAQ installed so I can't load it entirely, nor run it of course, but I can see that you appear to be calling the SUM function correctly.
One point: You create a large number of image buffers in preparation for each acquired image. This will use a lot of memory. You could just use one single buffer for the incoming image, replacing the content with each acquisition, and one buffer for the stored summation image.
So why do you see an increasing brightness towards the left of your image? Well, I suspect that with multiple summations of a dark image, any very slight gradient in the image will get exaggerated with each iteration. Therefore, if the pixels towards the right are near value 10, but the pixels towards the left are closer to 20, both will appear as virtually black to the human eye, but after 200 summations this becomes 2000 on the right and 4000 on the left, which will then show as a very obvious gradient. I suspect there's nothing wrong here; it's just highlighting the inaccuracies in the camera.
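The arithmetic behind that explanation, spelled out with the example values from above:

```python
# Single-frame pixel values: both look essentially black on screen.
right, left = 10, 20
n_frames = 200

right_sum = right * n_frames   # 2000
left_sum = left * n_frames     # 4000

# The 2:1 ratio never changes, but the absolute difference grows from
# 10 counts to 2000 counts, which is obvious once the display is scaled.
difference = left_sum - right_sum
```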
03-18-2015 02:05 AM
Hi,
I am working on a similar project: displaying all the captured images by coordinate using a Logitech HD camera, with the first image captured at position 0,0, and so on. The above VI is not working for me; it shows an error in the IMAQ Write PNG File subVI.
Thank you