05-12-2012 12:54 AM
Hello,
It would be great if someone could help me with the following problem:
I have a Logitech webcam. I use the NI Vision Acquisition software with LabVIEW to communicate with the webcam over USB. I snap images and convert them to grayscale, then convert each image to an array and sum over all array elements to get the total intensity of all pixels in the image. This runs in a while loop so that I get the image and this value continuously, like video from the webcam. The frame rate for the image grabbing is 30.0 fps. The value is plotted over time in a chart.
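For anyone without LabVIEW, the loop is roughly equivalent to the Python/OpenCV sketch below; the OpenCV calls and the device index are just assumptions for illustration, the real code is the attached VI:

```python
# Illustrative sketch only -- the actual implementation is a LabVIEW VI.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)              # assumed device index for the webcam
try:
    while True:
        ok, frame = cap.read()         # "snap" one frame
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # convert to grayscale
        total = int(np.sum(gray, dtype=np.int64))        # sum of all pixel values
        print(total)                   # in the VI this value feeds the chart
        cv2.imshow("webcam", gray)     # live display, like the VI front panel
        if cv2.waitKey(1) == 27:       # press Esc to stop
            break
finally:
    cap.release()
```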
Now, if I cover the webcam lens (so that basically no light is incident on it), I would expect the pixel values to go down (more black, so the total value goes down), but instead the frame rate slows down to 6-7 fps. Can anyone tell me why this is happening?
The ultimate goal of the project is to aim the webcam at the laser beam coming out of the beam splitter cube of a confocal microscopy set-up and measure its power to help in focusing the microscope lens. The pixel-value sum in the VI would serve as the integrated power. As the power is varied, I would ideally expect the total value to change, not the frame rate.
Can someone help me with this please?
I have attached the vi.
Some help is required urgently.
Thanks a lot.
05-12-2012 03:13 AM
It's been a long time since I played around with a camera, but it sounds to me like you still have the camera settings on auto.
If you just set the light/contrast settings to fixed values, I would expect this problem to go away. Maybe there are some more 'auto' settings that need to be turned off?
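In the VI this would be done through the IMAQdx camera attributes; purely as an illustration of the idea, the OpenCV equivalent looks roughly like the sketch below (the property constants exist in OpenCV, but the exact values that mean "manual" are backend/driver dependent, so treat the numbers as assumptions):

```python
import cv2

cap = cv2.VideoCapture(0)

# Switch to manual exposure so the driver stops lengthening the exposure
# time in the dark (which is what drags the frame rate down).
# NOTE: the value meaning "manual" differs between backends (e.g. V4L2 vs.
# DirectShow), so 0.25 here is an assumption, not a universal constant.
cap.set(cv2.CAP_PROP_AUTO_EXPOSURE, 0.25)
cap.set(cv2.CAP_PROP_EXPOSURE, -6)   # fixed exposure; units are driver-specific
```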
05-12-2012 09:14 AM
Yes, you were correct. I had put the brightness and contrast on manual but missed the exposure when I set the capture settings.
And now it works.
Thanks a lot.
05-14-2012 09:45 AM
A word of caution: if you are summing all pixels in the image, you may have a significant number of non-zero pixels even in total darkness.
This value may change with temperature and time.
You may want to run a test from time to time with all light blocked from entering the camera to see what the summation amounts to; you could then use that value as a dark offset to subtract.
Then again, if your beam is bright enough, the dark level may hardly be significant.
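A minimal sketch of the idea (plain NumPy, function names are made up for illustration): average the summed intensity of a few frames captured with the lens covered and subtract that offset from every live reading.

```python
import numpy as np

def dark_offset(dark_frames):
    """Mean summed intensity of frames captured with the lens covered."""
    return float(np.mean([np.sum(f, dtype=np.int64) for f in dark_frames]))

def corrected_intensity(gray_frame, offset):
    """Summed intensity of a live frame with the dark offset removed."""
    return float(np.sum(gray_frame, dtype=np.int64)) - offset
```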
-AK2DM
05-14-2012 04:13 PM
Thanks for the suggestion. This is a good idea and may account for some amount of realistic error in my experiments. The beam is, however, extremely bright, since it comes from a pretty powerful laser, which makes me think the level of error may not be very significant. But I will try out what you suggested just to be sure.
Thanks a lot.
05-14-2012 04:24 PM
The other thing you need to avoid is saturation. If any pixels are saturated, the measurement is not valid. You need to attenuate the beam sufficiently that no pixel hits its maximum possible value. Then the dark current noise may not be so insignificant.
Defocusing the beam so that it spreads over much of the image aperture may help. Just do not throw away part of the beam.
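A quick way to catch saturation (just a NumPy sketch, assuming 8-bit grayscale frames) is to check whether any pixel sits at full scale before trusting the sum:

```python
import numpy as np

def is_saturated(gray_frame, full_scale=255, margin=0):
    """True if any pixel is at (or within `margin` of) the maximum value."""
    return bool(np.any(gray_frame >= full_scale - margin))

# If this returns True, attenuate the beam further before using the
# summed intensity as a power measurement.
```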
Lynn
05-14-2012 04:24 PM - edited 05-14-2012 04:29 PM
For sure it sounds like there's an auto-adjust setting on somewhere. What's happening is that the exposure (shutter) time is increasing to allow more light onto the sensor, since you're probably limited to a small aperture range on that camera. (Multiplying the f-number by √2, i.e. one stop smaller aperture, while doubling the shutter time gives the same exposure value.)
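For the record, that trade-off comes from the exposure-value relation EV = log2(N^2 / t), with f-number N and exposure time t; a quick check with illustrative numbers:

```python
import math

def exposure_value(f_number, exposure_time_s):
    """EV = log2(N^2 / t); higher EV means less light reaches the sensor."""
    return math.log2(f_number**2 / exposure_time_s)

# One stop smaller aperture (f/4 -> f/5.6) with double the exposure time
# gives essentially the same EV, i.e. the same amount of light collected.
print(exposure_value(4.0, 1/60))   # ~9.9
print(exposure_value(5.6, 1/30))   # ~9.9
```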
Make sure it's not too bright... intense light sources have a tendency to damage sensors ($$$ on DSLRs); you always want to use an ND filter if you're shooting the sun or welding, or really bright reflections of either.
You can use a neutral-density filter, available at most camera shops, to knock the brightness entering the camera down to reasonable levels. A 6- to 8-stop filter is about as dark as a welding mask; a 12-stop filter is also known as "black glass" for good reason. You might need something that dark to bring the light reaching the sensor down to a reasonable level if your laser source is bright enough.
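To put numbers on those stops (each stop of neutral density halves the transmitted light, so an N-stop filter attenuates by a factor of 2^N):

```python
# Attenuation factor for a given number of ND stops.
for stops in (6, 8, 12):
    print(f"{stops}-stop ND filter cuts the light by a factor of {2 ** stops}")
# 6-stop  ->   64x
# 8-stop  ->  256x
# 12-stop -> 4096x
```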