09-21-2012 01:23 PM
Hi,
I am writing code to acquire a spectrum from a spectrometer, and I want the option of with/without background subtraction.
When I set the array to all 0s for the no-background-subtraction case, the final spectrum shows all 0s.
By monitoring the data, I found it is the 'minus' function that causes this: it returns an array of all 0s when subtracting the background (all 0 values) from the signal (not all 0).
I simulated the signal using random numbers (code attached), but the idea is the same.
Can you help me find a solution to this?
Thanks
09-21-2012 01:35 PM
You are using an array of size zero when you aren't subtracting the background. The arrays need to be the same size. Better yet, simply skip the subtraction when the background should not be removed.
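To see why the size-zero array matters: LabVIEW's polymorphic Subtract, applied element-wise to two arrays of different lengths, produces an output only as long as the shorter input, so an empty background yields an empty result. A minimal Python sketch of that truncating behavior (the function name is illustrative, not from the attached VI):

```python
def subtract_arrays(signal, background):
    # Mimics LabVIEW's element-wise array Subtract: the output length
    # equals the shorter of the two inputs, so an empty background
    # produces an empty result.
    return [s - b for s, b in zip(signal, background)]

signal = [1.5, 2.0, 3.25]
print(subtract_arrays(signal, []))                    # -> [] (looks like "all 0" on an indicator)
print(subtract_arrays(signal, [0.0] * len(signal)))   # -> [1.5, 2.0, 3.25]
```

With a background of matching size (even all zeros), the signal passes through unchanged.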
09-21-2012 01:44 PM
Hi Mark,
Thanks for the reply. I am not familiar with LabVIEW. How do I set the size of the array? By initializing it?
I also want to skip the background subtraction when it is not needed, but how can I bypass it?
Thanks
09-21-2012 02:12 PM
To bypass the background subtraction, you can use a case structure, i.e. 'TRUE' = subtraction case, 'FALSE' = no-subtraction case.
09-21-2012 02:19 PM - edited 09-21-2012 02:20 PM
Just use a case structure.
The False case only has the wire from the Signal tunnel to the Final Spectrum tunnel (passing straight through).
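That case-structure dataflow can be sketched in Python like this (the names are illustrative, not from the original VI):

```python
def final_spectrum(signal, background, subtract_background):
    # TRUE case: subtract the background element-wise.
    if subtract_background:
        return [s - b for s, b in zip(signal, background)]
    # FALSE case: the signal wire passes straight through
    # to the Final Spectrum tunnel, untouched.
    return signal

signal = [10.0, 12.5, 9.0]
background = [1.0, 1.5, 1.0]
print(final_spectrum(signal, background, True))   # -> [9.0, 11.0, 8.0]
print(final_spectrum(signal, background, False))  # -> [10.0, 12.5, 9.0]
```

The key point is that the subtraction node is never executed in the FALSE case, so the background array's size no longer matters.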
09-21-2012 03:11 PM
Alternatively, you could build an array of zeroes of the same size instead of the random numbers.
Also, maybe the attached can give you some ideas.
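The zero-filled-background alternative, sketched in Python under the assumption that the background is built to match the signal's length (e.g. with Initialize Array in LabVIEW):

```python
signal = [4.0, 5.5, 6.0]

# Background of zeroes, same size as the signal, so the element-wise
# subtraction covers the full signal instead of truncating to nothing.
background = [0.0] * len(signal)

final_spectrum = [s - b for s, b in zip(signal, background)]
print(final_spectrum)  # -> [4.0, 5.5, 6.0], the signal unchanged
```

Subtracting all zeroes is a no-op, so this keeps a single code path at the cost of one redundant subtraction per point.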
09-21-2012 03:32 PM
Hi,
Thanks to all of you guys, this is very helpful.