USB4000 integration time error

Hi

 

I've been running into problems setting the integration time for a USB4000 Ocean Optics spectrometer using a driver I found on the NI website. If I set my program to obtain 5 readings with an integration time of 100 ms, the readings appear in the graph as expected. But if I then reset the integration time to 5 ms and run it again, remnants of the old readings remain: the first reading looks similar to the readings taken at 100 ms, while the remaining 4 readings look correct for the new integration time.

 

I've attached my VI. Any clarification would be much appreciated! 🙂

 

Thanks,

Message 1 of 5

When you say 100 ms and 5 ms, are you talking about the delay between measurements or the integration time? If you mean the integration time, the lowest you can get is 1000 ms, since you're multiplying an integer by 1000. If you mean the delay between measurements, then that is possible.

 

From http://www.oceanoptics.com/Products/usb4000.asp

"…an electronic shutter for spectrometer integration times as fast as 3.8 milliseconds…"

 

This means that unless you have the Shutter Mode, you'll be limited by this. If you set the delay between measurements to 5 ms (0.1 on the Front Panel), then you're expecting the integration AND the calculations to be performed within 5 ms.

 

The "delay between measurements" will NOT actually give a delay between measurements. Because the wait runs in parallel with the rest of the loop, the loop only waits at most the specified number of milliseconds and then continues. If you really want a delay on top of however long the measurement takes, you'll need to sequence things so the calculations happen first AND THEN the waiting time.
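Since LabVIEW code can't be shown as text, here is a minimal Python sketch of the same idea. The 20 ms measurement time is a made-up stand-in, not a real spectrometer call; the point is that a wait running in parallel with the measurement is absorbed by it, while a sequential wait adds on top:

```python
import time

def measure():
    # Hypothetical stand-in for the spectrometer read; assume ~20 ms.
    time.sleep(0.020)

delay_s = 0.005  # requested 5 ms "delay between measurements"

# Parallel-style wait (a Wait node sitting next to the measurement):
# the loop period is max(measurement time, delay), so a 5 ms wait
# adds nothing once the measurement itself exceeds 5 ms.
start = time.monotonic()
measure()
elapsed = time.monotonic() - start
time.sleep(max(0.0, delay_s - elapsed))
parallel_period = time.monotonic() - start

# Sequential wait (flat-sequence style): measurement, THEN the delay,
# so the loop period is measurement time + delay.
start = time.monotonic()
measure()
time.sleep(delay_s)
sequential_period = time.monotonic() - start

print(f"parallel period:   {parallel_period * 1000:.0f} ms")
print(f"sequential period: {sequential_period * 1000:.0f} ms")
```

With a 20 ms measurement, the parallel version finishes in about 20 ms (the 5 ms wait is swallowed), while the sequential version takes about 25 ms.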

 

In general, VISA calls from LabVIEW take longer than the times specified by the manufacturer, so it is wise to give LabVIEW a little extra time to do its job.

Message 2 of 5

Regarding the 100 ms and 5 ms: yes, I did mean integration times. The reason I multiply by 1000 is that the driver's units are set to microseconds by default, so I am converting from milliseconds to microseconds.
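For reference, the conversion described above amounts to this (the function name is mine, not part of the driver):

```python
def ms_to_us(integration_time_ms: int) -> int:
    """Convert an integration time from milliseconds to microseconds."""
    return integration_time_ms * 1000

# The two front-panel values from this thread:
print(ms_to_us(100))  # 100 ms -> 100000 us
print(ms_to_us(5))    # 5 ms   -> 5000 us
```

So a front-panel value of 5 really does reach the driver as 5000 µs (5 ms), not 5000 ms.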

 

What do you mean by incorporating a delay? The link you posted indicates that this is only an issue for integration times as low as 3.8 ms, whereas my lowest is approximately 5 to 6 ms.

 

Also, how should a proper delay be incorporated? You said that what I have now will not actually give a delay that ensures both the calculations and the integration have time to happen.

 

Looking forward to hearing back!

 

Thanks!

 

Message 3 of 5

@AKCanada wrote:

 

Regarding the 100 ms and 5 ms: yes, I did mean integration times. The reason I multiply by 1000 is that the driver's units are set to microseconds by default, so I am converting from milliseconds to microseconds.


That's what I suspected.


@AKCanada wrote:

 

What do you mean by incorporating a delay? The link you posted indicates that this is only an issue for integration times as low as 3.8 ms, whereas my lowest is approximately 5 to 6 ms.


 

Like I said before, the VISA calls in LabVIEW might require more time than the stated 3.8 ms. Do you have the issue at 10 ms? At 25 ms? Is there a threshold that needs to be met?


@AKCanada wrote:

 

Also, how should a proper delay be incorporated? You said that what I have now will not actually give a delay that ensures both the calculations and the integration have time to happen.


Depending on what you mean by "delay between measurements", you could do it your way. I would think "delay between" would mean: perform a measurement (however long it takes), wait 100 ms (for example), perform the next measurement, wait 100 ms, and so on. If you mean "delay" in that sense, I'd use a flat sequence within the loop: do the measurements/calculations in one frame and then wait in the next frame. However, if you want the "delay" to be between the start of each measurement, you can keep it the way you have it.
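The two interpretations above can be sketched in Python (the 20 ms measurement is again a hypothetical stand-in). Pattern 1 waits after each measurement, so the period is measurement time plus delay; pattern 2 schedules measurement starts on a fixed 100 ms grid:

```python
import time

def measure():
    time.sleep(0.020)  # hypothetical ~20 ms measurement

period = 0.100  # 100 ms

# Pattern 1: delay AFTER each measurement (flat-sequence style).
# Each iteration takes measurement time + 100 ms.
starts_after = []
t = time.monotonic()
for _ in range(3):
    starts_after.append(time.monotonic() - t)
    measure()
    time.sleep(period)

# Pattern 2: delay between measurement STARTS (fixed period).
# Each iteration begins on a 100 ms grid, regardless of how long
# the measurement took, as long as it fits within the period.
starts_fixed = []
t = time.monotonic()
next_start = t
for _ in range(3):
    starts_fixed.append(time.monotonic() - t)
    measure()
    next_start += period
    time.sleep(max(0.0, next_start - time.monotonic()))

print("delay-after starts:", [f"{s * 1000:.0f} ms" for s in starts_after])
print("fixed-period starts:", [f"{s * 1000:.0f} ms" for s in starts_fixed])
```

With delay-after, consecutive starts are roughly 120 ms apart; with the fixed period they are roughly 100 ms apart.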

 

I'm not exactly sure how this instrument works, but if "read" and "integrate" can be separated, you could read every 5 ms and do the integration in another loop, taking as much time as it needs. However, if the integration itself must happen every 5 ms, you might not be able to achieve 5 ms.
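That split is the classic producer/consumer pattern. A minimal Python sketch, with a fake 4-point "spectrum" standing in for a real VISA read (all names here are mine, not the driver's): one thread acquires as fast as the instrument allows, and a second loop processes at its own pace.

```python
import queue
import threading

data_q: "queue.Queue" = queue.Queue()

def acquire(n_readings: int) -> None:
    """Producer: read the instrument as fast as it allows and enqueue raw data."""
    for i in range(n_readings):
        spectrum = [float(i)] * 4  # stand-in for a VISA read (hypothetical)
        data_q.put(spectrum)
    data_q.put(None)               # sentinel: acquisition finished

def process() -> list:
    """Consumer: do the slow calculations, taking as long as needed."""
    results = []
    while (spectrum := data_q.get()) is not None:
        results.append(sum(spectrum))  # stand-in for the real analysis
    return results

t = threading.Thread(target=acquire, args=(5,))
t.start()
results = process()
t.join()
print(results)  # → [0.0, 4.0, 8.0, 12.0, 16.0]
```

In LabVIEW the equivalent structure would be two parallel loops joined by a queue, so the acquisition loop's timing isn't held up by the analysis.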

Message 4 of 5

Hi,

 

I think that helps clarify things a bit. After doing some digging, I think the issue may be that taking subsequent measurements inherently won't work unless a delay of some sort is implemented, since, as you said, the instrument cannot respond that fast. As for the nature of that delay, I think there may be a built-in VI that allows it to be done in a synchronized fashion. I will look into it further and see if I can resolve the issue. The delay I was incorporating does not seem to work.

Message 5 of 5