06-07-2011 08:42 AM
Hey everyone,
I currently have a program that takes an intensity image (see attachment below), and I would like to modify the brightness, contrast, and gamma levels of the image using BCGLookup.vi in the Vision Development Module. However, the intensity data is a 2D U16 array, and the VI that adjusts the image settings requires a 1D U8 array.
I was wondering how to convert the array, or if there are any other options to adjust the settings of the image. Thanks in advance for the help!
06-07-2011 10:02 AM
Do you just need to process each row of your 2D array? If so, you can put the function in a For Loop and wire the 2D array into the loop through an auto-indexing tunnel. Inside the loop, convert each row to U8 before feeding it into the function.
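The auto-indexing dataflow described above can be sketched in Python with NumPy (the image data here is made up for illustration; LabVIEW's auto-index tunnel hands the For Loop one row per iteration, which the plain `for` loop mimics):

```python
import numpy as np

# Hypothetical 2D U16 intensity image.
image_u16 = np.array([[0, 1024, 65535],
                      [256, 512, 32768]], dtype=np.uint16)

rows_u8 = []
for row in image_u16:  # one row per iteration, like an auto-indexing tunnel
    # A bare cast to U8 would keep only the low byte, so divide by 256
    # first to preserve the high-order intensity information.
    rows_u8.append((row // 256).astype(np.uint8))

# Stacking the converted rows corresponds to the auto-indexed output tunnel.
image_u8 = np.stack(rows_u8)
```

Each converted row is now a 1D U8 array of the kind the VI expects.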
06-07-2011 10:34 AM
jyang, thank you very much for your response. I believe I found a different solution altogether: converting each 16-bit intensity value to an 8-bit value via an intensity ratio (i.e., divide each intensity by the image maximum and multiply by 255). The attachment shows the new program.
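The ratio-based scaling amounts to the following, sketched here in NumPy with made-up intensity values. Note the multiplier is 255, not 256, so that the image maximum still fits in a U8:

```python
import numpy as np

# Hypothetical U16 intensity data.
image_u16 = np.array([[100, 5000, 40000]], dtype=np.uint16)

# Scale by the image maximum so the full dynamic range of this
# particular image maps onto 0..255.
max_val = image_u16.max()
image_u8 = np.round(image_u16.astype(np.float64) / max_val * 255).astype(np.uint8)
```

One design consequence: because the scale factor depends on the image's own maximum, the same physical intensity can map to different U8 values from frame to frame.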
06-07-2011 01:05 PM
Or you can just grab the upper byte with the Split Number function.
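Taking the upper byte is equivalent to a fixed right shift by 8 bits (a divide by 256). A NumPy sketch with made-up values, for comparison with the max-based scaling above:

```python
import numpy as np

image_u16 = np.array([[0x1234, 0xFF00, 0x00FF]], dtype=np.uint16)

# Keep the high byte of each 16-bit value. Unlike scaling by the image
# maximum, this mapping is fixed and does not vary with image content.
image_u8 = (image_u16 >> 8).astype(np.uint8)
```

The trade-off is that a dim image (all values below 256) maps entirely to zero, whereas the ratio method would stretch it across the full U8 range.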
/Y