11-04-2011 09:32 AM
Hi all,
I have a big problem getting accurate results from this VI. The VI calculates the DNL of an ADC using histogram testing; the formulas used to obtain the DNL result are attached below along with the VI.
So my questions are:
1- Is there a specific input frequency / ADC sampling frequency / input offset / input amplitude for each ADC resolution?
2- What is the relation between the FSR, the amplitude of the input signal, and the offset of the input signal?
3- How can I choose these parameters to get an accurate DNL result, for example for a 12-bit ADC?
Please, if you have any suggestions we can discuss, don't hesitate to respond. I'll be grateful if anyone could verify my VI.
Regards,
11-04-2011 10:14 AM
Do you get paid by the hectare? The style guides recommend keeping the size of panels and block diagrams to one screen. If you or anyone trying to help you has to scroll all over the countryside looking at the diagrams, it is very difficult to tell what a program is doing.
Now to your questions:
1- The characteristics of the test signals depend very strongly on the specifications of the ADC. The signal amplitude needs to cover all possible values of the input range so that all possible output codes are generated. The test signal frequency depends on the specified bandwidth of the ADC and on the sampling frequency. The quality of the test signal should be significantly better than the expected characteristics of the ADC.
2- As mentioned above, the input signal must span the full-scale range to completely test the ADC. It may need to go slightly beyond the nominal range to allow for variations in the device.
3- I suggest that you look at the data available from several ADC manufacturers; they report the methods used to test their devices. The IEEE Transactions on Instrumentation and Measurement has also published numerous articles on testing ADCs in recent years.
Lynn
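Not the poster's VI, but for reference, the histogram method being discussed can be sketched in a few lines of text code. This assumes a full-scale ramp input, where every code should ideally get the same number of hits (a sine-wave input additionally needs the sine-PDF correction described in IEEE 1241); all names here are illustrative.

```python
import numpy as np

def dnl_inl_from_codes(codes, n_bits):
    """Histogram-based DNL/INL for a uniform (ramp) test input.

    codes  : sequence of raw ADC output codes
    n_bits : ADC resolution
    Returns (dnl, inl) in LSB for the inner codes; the two end
    codes saturate and are normally excluded from the test.
    """
    n_codes = 2 ** n_bits
    hist = np.bincount(np.asarray(codes), minlength=n_codes)
    # Drop the first and last code: they absorb everything
    # beyond full scale, so their counts are not meaningful.
    core = hist[1:-1].astype(float)
    avg = core.mean()           # ideal hit count per code for a ramp
    dnl = core / avg - 1.0      # DNL[k] = H[k] / H_avg - 1   (in LSB)
    inl = np.cumsum(dnl)        # INL is the running sum of the DNL
    return dnl, inl

# Example: an ideal 3-bit ADC driven by a slow full-scale ramp
ramp = np.linspace(0.0, 1.0, 8000, endpoint=False)
codes = np.clip((ramp * 8).astype(int), 0, 7)
dnl, inl = dnl_inl_from_codes(codes, 3)   # all zeros for an ideal ADC
```

For an ideal converter every inner code collects the same count, so DNL and INL come out as zero; a missing code shows up as DNL = -1 for that bin.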
11-04-2011 10:40 AM
Thank you, johnsold, for your quick response.
1- In fact I'm referring to the IEEE 1241 document, in which I found those formulas, but all I need is an example (the information given there is very general).
2- You're right, the VI is too long; I may use subVIs to make it clearer. Thank you for noticing it.
3- It would be great if you could explain a bit more how an input signal with amplitude "A" is adapted to the full scale of an ADC. If the amplitude exceeds the FSR, for example a signal with 5 volts of amplitude and FSR = 3 V, then the FSR can't reach 5 volts, so the ADC can't sample the part of the analog input around 5 volts and code it. What you mentioned seems to be the opposite of this idea; how is this possible? (Could you please explain more with an example?)
Thank you,
11-04-2011 10:43 PM
1- Please provide the specifications for one ADC which you might wish to test. There are so many that I might choose an example very irrelevant to you.
3- Take the example of an ADC with FSR = 3 V. If the device were perfect, a signal which covers 0 to 3 volts would evoke every possible output code. However, suppose the actual device did not produce the maximum code until the input reached 3.05 V. Then a signal from 0 to 3 V does not test every code for that particular device. So, the signal is selected to be sufficiently large to cover the expected tolerance about the nominal range. If you apply a 5 V signal to a 3 V device, it is possible that the excessive signal could damage the device or produce anomalous behavior and disrupt the measurement. Some types of opamps can reverse their outputs if the inputs exceed the specified common mode range by more than 0.6 V.
Lynn
11-08-2011 03:03 AM
Hi johnsold,
Thank you for your explanations, it is now clear: so if we want to evoke all of the ADC's possible codes, we must use an input which is slightly greater than the FSR (considering the tolerance).
1- You talked about a signal which covers 0 to 3 volts. Is that the peak-to-peak voltage? Does that mean the amplitude of the signal is 3 volts with a 0 V offset (so the peak-to-peak equals 6 volts), or that the amplitude is 1.5 V with a 1.5 V offset (so the peak-to-peak voltage equals 3 volts)? Does the ADC see the input voltage as peak-to-peak and then adapt it to its FSR?
2- Here are the specifications of the ADC which I'll use:
It is the STM32's ADC.
a) Vdda (power supply): min 2.4 V, max 3.6 V
b) Vref+: min 2.4 V, max Vdda
c) ADC clock frequency: min 0.6 MHz, max 12 MHz
d) Sampling rate: min 0.05 MHz, max 1 MHz
Thank you, johnsold, for your help; it matters a lot to me.
11-08-2011 09:14 AM
Hello all,
I've attached an improved version of my work (calculating DNL/INL based on the histogram).
My apologies for the first version.
Any help is greatly appreciated 🙂
11-08-2011 07:27 PM
Several subVIs are missing so I cannot run the histogram VI.
The ADC only looks at one voltage at a time, so it does not care how you define your signal. If the ADC input range is 0 to 3 volts, then your signal must produce every voltage (with better than 1 LSB resolution) within the 0 to 3 volt range over the time of the test. The typical description of this would be amplitude 1.5 V, offset 1.5 V, producing peak to peak 3 V. For a 12-bit converter with a 3 volt range the resolution is 732 uV.
Lynn
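The resolution figure quoted above can be checked with a couple of lines of plain arithmetic (nothing VI-specific):

```python
fsr = 3.0                     # full-scale range, volts
n_bits = 12
lsb = fsr / 2 ** n_bits       # ideal code width: FSR / 2^N
# 3 V / 4096 = 732.4 uV, matching the number quoted above
print(f"1 LSB = {lsb * 1e6:.1f} uV")
```

The ramp or sine test source therefore needs amplitude resolution and noise well below this 732 uV step, which is why the signal quality must be significantly better than the ADC under test.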
11-09-2011 02:56 AM
Hello johnsold,
- Thank you again for these explanations, this is very helpful for me 🙂
- I've made some corrections to my work (I spent all night on it), so the subVIs and the main VI are attached below.
Thank you for taking a look at it (I've tried my best to make it as clear as possible so you can understand it easily).
*The problem with this VI is that it works at 3-bit resolution: with the default values (Amplitude = 2 V, Offset = 2.048 V, Vref = 4.096 V) and 3 bits of resolution, the results seemed coherent. But when changing the resolution to 12 bits (with the same values used at 3 bits), the results are incoherent!
Best Regards,
11-09-2011 02:58 AM
And here is the subVI for making the histogram.
11-09-2011 01:05 PM
Hi samiti,
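One possible contributor to "works at 3 bits, fails at 12 bits" (an assumption about histogram testing in general, not a diagnosis of the posted VI) is record length: with the same number of samples, the average hit count per code shrinks by a factor of 2^9 when going from 3 to 12 bits, so the 12-bit histogram is far noisier and the DNL/INL estimates degrade. A rough back-of-envelope, with the 100-hits-per-code target as an illustrative choice:

```python
def samples_needed(n_bits, hits_per_code=100):
    """Very rough record length for a ramp histogram test:
    average hits_per_code samples in each of the 2**n_bits bins.
    A sine input spends the least time near mid-scale, so the
    mid codes need even longer records; see IEEE 1241 for the
    rigorous confidence-based formula."""
    return hits_per_code * 2 ** n_bits

n3 = samples_needed(3)    # 800 samples
n12 = samples_needed(12)  # 409,600 samples
```

Such record lengths, combined with intermediate array copies on a large block diagram, may also explain LabVIEW running out of memory in the INL stage.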
Could you explain what you mean by incoherent? I was able to get your program to run using a 12-bit resolution and received a histogram that looked alright after setting the X axis and the Y axis to autoscale on the histogram plot. However, during the INL part of the program when running on my end Labview runs out of memory during the INL section at the end of your code. Are you experiencing similar behavior?