07-13-2014 11:53 PM
Even if 60 is the dark current value for the detector, you still need to run the electronics for the full exposure time before the CCD array gets dumped into the buffer. When you say "the same code", I assume you are talking about some C code or another textual language.
I am fairly sure that your issue is not a LabVIEW issue but that your code is not "the exact" same code. You can pass an array to the C code and have it fill the array, as you demonstrated. The problem is getting those values from the internal buffer into the LV array.
Just because you have a dark current offset does not mean the readout electronics are free of a timing issue.
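Roughly, the sequence has to look like the sketch below. Every camera_* name here is a placeholder for whatever your driver actually exports; the point is the order of the calls:

    #include <stdint.h>

    /* Hypothetical driver calls -- substitute the real API. */
    int camera_set_exposure(double seconds);
    int camera_start_exposure(void);
    int camera_exposure_done(void);
    int camera_readout(uint16_t *pixels, int32_t npixels);

    int grab_frame(uint16_t *pixels, int32_t npixels, double exposure_s)
    {
        int status;

        if ((status = camera_set_exposure(exposure_s)) != 0) return status;
        if ((status = camera_start_exposure()) != 0)         return status;

        /* The detector must integrate for the full exposure time before
           the CCD is clocked out; reading earlier returns a stale or
           partially filled buffer. */
        while (!camera_exposure_done())
            ;  /* poll or sleep here */

        /* Only now transfer the camera's internal buffer into the
           caller's (LabVIEW-allocated) array. */
        return camera_readout(pixels, npixels);
    }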
07-14-2014 01:08 AM - edited 07-14-2014 01:10 AM
@sth wrote:
But even if it were an int of another length, the code would step through the buffer wrong and you would get weird numbers, not all values off by 20. If it were really a buffer of bytes and you stepped through it as int16, then you would get 2 bytes fused into each value and step off the end of the array.
My guess is a timing issue. He sets the exposure time but does not trigger an exposure or wait the 10 ms for the exposure to complete. For a CCD you need to let the buffer fill before reading it out. I think the buffer does not contain what he thinks it does.
I don't know the exact hardware, but there seems to be something missing in the program.
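To make that first point concrete, here is a minimal C sketch (the buffer contents are made up) of what reading a byte buffer with the wrong element size does to the numbers:

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        uint8_t raw[8] = {40, 41, 42, 43, 44, 45, 46, 47};  /* really bytes */

        /* Reinterpret the same 8 bytes as four 16-bit values.  Each
           result fuses two neighbouring bytes (on a little-endian
           machine the first becomes 41*256 + 40 = 10536), so the output
           is scrambled garbage, not a uniform offset -- and code that
           walked 8 "int16" elements would also run 8 bytes past the end
           of the array. */
        uint16_t wrong[4];
        memcpy(wrong, raw, sizeof wrong);
        for (int i = 0; i < 4; i++)
            printf("element %d read as uint16: %u\n", i, (unsigned)wrong[i]);
        return 0;
    }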
I agree that the observed difference is not likely caused by a data type mismatch, but unless you know EXACTLY what you are doing, typecasts are not only ugly but also a very strong indication of a problem. So the first step in debugging is always to get rid of as many typecasts as possible and compile at the maximum warning level to see what the compiler has to say about it.
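A single cast is enough to hide a wrong buffer type from the compiler entirely; in the sketch below the driver prototype is hypothetical:

    #include <stdint.h>

    /* Hypothetical vendor prototype: the readout buffer is declared
       16-bit. */
    int driver_readout(uint16_t *buf, int32_t nelems);

    void with_cast(uint32_t *pixels, int32_t n)
    {
        /* Compiles silently even though every element is the wrong
           size; the cast has disabled the compiler's type checking. */
        driver_readout((uint16_t *)pixels, n);
    }

    void without_cast(uint16_t *pixels, int32_t n)
    {
        /* With the cast removed, any pointer-type mismatch would be
           flagged immediately when building with e.g. gcc -Wall -Wextra. */
        driver_readout(pixels, n);
    }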
In this case, considering the full numerical range of uint16_t and the small difference that is observed, a timing issue seems the most logical cause. How the OP can keep claiming that his code is EXACTLY equivalent to what the professional image capture application runs is beyond me, as I doubt he has any way to see that application's source code and conclude what API calls are made and with what overall timing.
Maybe the call to the aperture or setup-time setting has a data type mismatch, resulting in a much smaller value than the OP thinks. But claiming that everything in the way the API is called is right won't make it do what is desired. Debugging consists, for instance, of source-level debugging, where one breaks into the source code of a debug build and can directly observe the numbers in the variables in the debugger. If that is for whatever reason not possible, then a well-chosen number of fprintf() or even LabVIEW DbgPrintf() calls can quickly show what the C code really gets to see (I would be highly surprised if it were different in any way from what is visible on the LabVIEW side after the call).
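As a sketch, a pair of prints around the driver call is already enough to settle what the C side sees; all the names below are placeholders:

    #include <stdio.h>
    #include <stdint.h>

    int driver_read(uint16_t *buf, int32_t nelems);  /* hypothetical driver call */

    int capture(uint16_t *pixels, int32_t npixels, double exposure_s)
    {
        /* What did the C function actually receive from LabVIEW? */
        fprintf(stderr, "capture in:  npixels=%d exposure=%g s\n",
                (int)npixels, exposure_s);

        int status = driver_read(pixels, npixels);

        /* And what is in the array right before it goes back to LabVIEW? */
        fprintf(stderr, "capture out: status=%d pixels[0..3]=%u %u %u %u\n",
                status, pixels[0], pixels[1], pixels[2], pixels[3]);
        return status;
    }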
And screenshots are totally useless to prove that the code in one program is the same as in the other. They only show the area where the programmer thinks the problem is, not the whole picture, such as how the camera is set up and prepared.
07-15-2014 09:33 AM
Yeah, I agree with what you said. I think I need to further investigate what's happening with the camera. Thank you!