LabVIEW


How to convert a 2D array

Hey everyone,

 

I currently have a program that takes an intensity image (see attachment below), and I would like to be able to modify the Brightness, Contrast, and Gamma levels of the image using BCGlookup.vi from the Vision Development Module. However, the intensity data is a 2D U16 array, and the VI that adjusts the image settings requires a 1D U8 array.

 

I was wondering how to convert the array, or if there are any other options to adjust the settings of the image. Thanks in advance for the help!

Message 1 of 4

Do you just need to process each row of your 2D array?  If so, you can put the function in a For Loop and wire the 2D array into the loop through an auto-indexing tunnel.  Inside the loop, convert each row to U8 before you feed it into the function.
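In text form, the row-by-row approach looks roughly like this (a minimal Python sketch, not LabVIEW; `to_u8_clamped` mimics coercing each value into the U8 range, and `row_fn` stands in for whatever per-row processing you wire inside the loop):

```python
def to_u8_clamped(x):
    # Clamp a U16 value (0..65535) into the U8 range 0..255
    return min(x, 255)

def process_rows(image_u16, row_fn):
    """Apply row_fn to each row of a 2D array, like an
    auto-indexing For Loop handing the function one 1D row
    per iteration."""
    out = []
    for row in image_u16:              # one iteration per row
        row_u8 = [to_u8_clamped(v) for v in row]
        out.append(row_fn(row_u8))     # e.g. the BCG lookup per row
    return out

# Tiny 2x3 U16 "image", identity processing for illustration
img = [[0, 300, 65535], [17, 128, 256]]
print(process_rows(img, lambda r: r))
```

Note that simply clamping throws away everything above 255, so this only makes sense if your intensities already fit in (or have been rescaled to) the U8 range.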

------------------------------------------------------------------

Kudos and Accepted as Solution are welcome!
Message 2 of 4

jyang, thank you very much for your response. I believe I found a different solution altogether, by converting each 16-bit intensity value into an 8-bit intensity value via an intensity ratio (i.e., divide each intensity by the maximum and multiply by 255). The attachment shows the new program.
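A minimal Python sketch of that intensity-ratio scaling (not the attached LabVIEW code); it uses 255 as the full-scale factor so the peak intensity maps exactly onto the top of the U8 range without overflowing:

```python
def scale_u16_to_u8(image):
    """Scale a 2D U16 array into U8 by intensity ratio:
    each value is divided by the image maximum and
    multiplied by the U8 full scale."""
    peak = max(max(row) for row in image)
    if peak == 0:
        # All-black image: nothing to scale
        return [[0] * len(row) for row in image]
    # Integer multiply-then-divide keeps everything exact;
    # the peak value maps to exactly 255.
    return [[v * 255 // peak for v in row] for row in image]

img = [[0, 500, 1000], [250, 750, 1000]]
print(scale_u16_to_u8(img))
```

Unlike clamping or dropping a byte, this preserves the relative contrast of the whole image, at the cost of quantizing 16-bit data down to 256 levels.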

Message 3 of 4

Or you can just grab the upper byte through a Split Number.
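In text form, taking the upper byte of each U16 sample is just a right shift by 8 (a Python sketch of what splitting the word does; whether that is acceptable depends on where the significant bits of your camera data actually sit):

```python
def upper_byte(u16):
    # High byte of a 16-bit value, i.e. the most significant
    # 8 bits; equivalent to u16 // 256 for unsigned input.
    return (u16 >> 8) & 0xFF

# A few sample U16 intensities and their upper bytes
print([upper_byte(v) for v in [0, 255, 256, 4096, 65535]])
```

This is the cheapest conversion, but it discards the low byte entirely, so images that only use the bottom of the 16-bit range come out nearly black.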

 

/Y 

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 4 of 4