02-16-2015 12:54 PM
Hello all NI experts:
We have used an NI PCI-5154 for a few years now. We use it to capture pulse waveforms whose timing relations we care about.
We operate the digitizer at a 1 GHz sample rate, and until today we assumed the sample rate is accurate and constant. Today a group member questioned this, since the digitizer specification says the timebase has a drift of about "±7 ppm per °C". If this is true, and we assume an operating temperature 20 degrees higher than the temperature at which the digitizer was calibrated, then the drift can reach up to 140 ppm × 1 GHz, which is 140 kHz?! This would be a killer for our measurements.
Please help to clarify this issue so we can estimate the errors in our measurements.
02-16-2015 04:35 PM
The PCI-5154 has a timebase accuracy of ±30 ppm, plus ±7 ppm per °C of drift. So if you are operating 20 °C above the external-calibration temperature, you have a total of 170 ppm, not 140 ppm. Also, to see the effect of the ppm error on your signal, you multiply the ppm by the frequency of the signal you are acquiring.
So, if your input signal is 10 MHz, then your measured frequency will be 10 MHz ± (10 MHz × 170 ppm) = 10 MHz ± 1.7 kHz.
If your input frequency is 1 MHz, then 1 MHz × 170 ppm = 170 Hz, so the measurement is 1 MHz ± 170 Hz.
So the impact on the accuracy of the measurements depends on the frequency of your input signals. The reason for this effect can be seen by looking at the actual time between samples. At 170 ppm, the 1 GHz clock will have a period somewhere between 0.99983 ns and 1.00017 ns (1 GHz ± 170 kHz). At an actual period of 1.00017 ns, software still represents each sample interval as 1 ns, and thus compresses time. This compression increases the frequency of the measured signal; the opposite occurs for the 0.99983 ns period. Thus each period of the timebase is off by at most ±0.00017 ns. Lower frequencies are affected less by this effect.
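As a quick sanity check of the arithmetic above (illustrative only, not NI driver code), the period bounds and the resulting apparent-frequency error can be computed like this:

```python
# Illustrative arithmetic only: how a ppm error on the sample clock maps to
# period bounds and to the apparent frequency error of an acquired signal.

def timebase_bounds(sample_rate_hz, total_ppm):
    """Return (min_period_s, max_period_s) for a clock with +/-total_ppm error."""
    err = total_ppm * 1e-6
    f_hi = sample_rate_hz * (1 + err)   # fast clock -> shorter period
    f_lo = sample_rate_hz * (1 - err)   # slow clock -> longer period
    return 1.0 / f_hi, 1.0 / f_lo

def apparent_freq_error_hz(signal_freq_hz, total_ppm):
    """Worst-case shift of a measured signal frequency, e.g. 10 MHz x 170 ppm."""
    return signal_freq_hz * total_ppm * 1e-6

p_min, p_max = timebase_bounds(1e9, 170)
print(p_min * 1e9, p_max * 1e9)           # ~0.99983 ns to ~1.00017 ns
print(apparent_freq_error_hz(10e6, 170))  # ~1.7 kHz at a 10 MHz input
```

This reproduces the numbers above: a 1 GHz clock at 170 ppm has a period between roughly 0.99983 ns and 1.00017 ns, and a 10 MHz tone can read back up to ±1.7 kHz off.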
Lastly, the timebase does not jitter very much, so you are not going to see measurements that are +170 ppm off on one sample and then -170 ppm off on the next sample, though it theoretically could drift that far over a longer period of time (undefined). You could characterize your system, measure the actual ppm error your measurements are experiencing, and then use that information to correct your measurements. We call this a system-level calibration, and it requires a HIGHLY accurate signal source to work well.
Anyway, I hope this helps.
-Nathan
02-17-2015 10:29 AM
Thanks, Nathan.
It's my fault that I didn't look into this issue more seriously before. I have a few questions:
1. Do you know the drift direction of the clock? When the operating temperature is 30 degrees higher, at 1 GHz, is the period 0.99983 ns or 1.00017 ns?
2. Does a soft calibration at the operating temperature help?
3. If the operating temperature changes over time, does the timebase drift accordingly and predictably?
4. If we can find a more stable frequency generator to use as an external clock for the digitizer, does the generator have to run at 1 GHz in order for the digitizer to sample at 1 GHz?
Thanks again for your help.
02-17-2015 10:52 AM - edited 02-17-2015 10:58 AM
1. Do you know the drift direction of the clock? When the operating temperature is 30 degrees higher, at 1 GHz, is the period 0.99983 ns or 1.00017 ns?
No, the drift is variable depending on the oscillator chip, and we just spec the absolute worst it could be. Most of the time the ppm will be better, but that is not guaranteed.
Also, if you are 30 degrees above the calibration temperature, your total error budget would be 240 ppm (30 ppm + 7 ppm/°C × 30 °C), and thus the 1 GHz clock would be:
1 GHz ± (1 GHz × 240 ppm) = 1 GHz ± 240 kHz = 0.99976 GHz to 1.00024 GHz, i.e. a period of 0.99976 ns to 1.00024 ns.
In your original post you only calculated the timebase drift error, but you need to add the timebase accuracy AND the timebase drift together.
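The total error budget described above (fixed accuracy plus temperature drift) can be sketched as a small helper, using the PCI-5154 numbers quoted in this thread:

```python
# Illustrative only: total worst-case timebase error for the PCI-5154 per the
# numbers quoted above (+/-30 ppm accuracy, +/-7 ppm per degC of drift from
# the external-calibration temperature).

def total_ppm(delta_temp_c, accuracy_ppm=30.0, drift_ppm_per_c=7.0):
    """Worst-case timebase error (ppm) at delta_temp_c from the cal temperature."""
    return accuracy_ppm + drift_ppm_per_c * abs(delta_temp_c)

print(total_ppm(20))  # 170.0 ppm at 20 degC from cal temperature
print(total_ppm(30))  # 240.0 ppm at 30 degC from cal temperature
```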
2. Does a soft calibration at the operating temperature help?
Self-calibration won't affect the timebase accuracy, but it does significantly help with vertical accuracy. In fact, our DC accuracy specification is based on the assumption that you are operating within 3 °C of the temperature of the last self-calibration.
3. If the operating temperature changes over time, does the timebase drift accordingly and predictably?
Again, this will vary from chip to chip, but the error magnitude will increase with temperature.
4. If we can find a more stable frequency generator to use as an external clock for the digitizer, does the generator have to run at 1 GHz in order for the digitizer to sample at 1 GHz?
This information can be found in the specifications documentation for your device.
Specifications Document:
http://www.ni.com/pdf/manuals/375021b.pdf
Basically, you can either supply a sample clock directly, which would then need to be at the sample rate you want to run at (1 GHz), or you can provide a reference clock, which can be anywhere from 1 MHz to 20 MHz; see page 13. The onboard sample clock will PLL-lock to the reference clock, so if the reference clock has 100 ppb accuracy, that accuracy also translates to the sample clock through the PLL.
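The "accuracy translates through the PLL" point above can be illustrated with a line of arithmetic (illustrative only; an ideal lock is assumed, ignoring any residual PLL error):

```python
# Illustrative only: when the digitizer PLL-locks its sample clock to an
# external reference, the reference's fractional error carries over directly
# to the sample clock (assuming an ideal lock).

def locked_sample_rate_hz(nominal_rate_hz, ref_error_ppb):
    """Actual sample rate when locked to a reference with the given error (ppb)."""
    return nominal_rate_hz * (1 + ref_error_ppb * 1e-9)

# A reference good to 100 ppb keeps the 1 GHz sample clock within ~100 Hz,
# versus up to ~170 kHz on the free-running onboard timebase at 170 ppm.
print(locked_sample_rate_hz(1e9, 100) - 1e9)  # ~100 Hz worst-case offset
```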
I hope this helps.
Nathan
02-17-2015 11:52 AM
Thanks, Nathan.
So it looks like the only way to get the actual timebase drift is to do a timebase measurement every time we use the digitizer? I mean, the digitizer could have a different timebase drift even at the same operating temperature at different times.
Our measurement is very sensitive to the exact clock period. We need to know the clock period to within 0.02% at least. I think we need to correct our results for the actual timebase drift. So how do we measure the exact sample clock frequency drift?
Thanks a lot.
02-17-2015 01:05 PM - edited 02-17-2015 01:09 PM
What kind of measurements are you taking? Why are they so sensitive to the timebase accuracy? 0.02% accuracy is 200 ppm, which we are below when operating within 20 °C of the external-calibration temperature.
To measure the timebase accuracy, you will need to provide a very accurate sine tone with a known accuracy. I use an SMA100A, for example. Then, in LabVIEW, you can use the following example code: http://www.ni.com/example/30792/en/.
Another way of performing this measurement, which I recently found but have not yet tried, can be found here: The Beat method
Regards,
Nathan
02-17-2015 02:20 PM
Thanks, Nathan.
We capture waveforms from a signal source to determine the lifetime of the source, aiming for a precision of 0.02%. We are already able to measure the source lifetime at 0.02% precision with analog instruments, and we had hoped to improve on that with high-speed digitizer measurements.
What I really care about is knowing the actual period when the digitizer operates at 1 GHz, so I can correct the measured lifetimes.
Again, does the timebase clock drift need to be measured/calibrated every time we take a real measurement? To reach 0.02% precision in the lifetime measurements, we wish to keep the error contribution from the digitizer timebase drift as small as possible.
02-17-2015 03:58 PM
Unfortunately we don't have any data on the repeatability of the timebase drift.
To calculate the actual timebase frequency, you just need to reverse the calculations we discussed so far. Measure a highly accurate source on the digitizer, and any frequency shifts in the signal would be caused by the non-ideal timebase period.
For example, you measure a 10 MHz signal at 1 GS/s and its frequency is reported as 10.001 MHz, so we are off by 1 kHz. 1 kHz = 10 MHz × X ppm; solving for X gives X = 100 ppm. Thus our sample clock is running 100 ppm off, and 1 GHz ± 100 ppm gives a period of 0.9999 ns or 1.0001 ns. Since the measured frequency increased by 1 kHz, the signal was compressed when interpreted at a 1 ns dt, so the actual clock period was 1.0001 ns.
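The back-calculation in that example can be written out directly (illustrative only; the function names are mine, not NI's):

```python
# Illustrative only: back out the actual timebase error from a measurement of
# a known-accurate reference tone, reversing the ppm calculation above.

def measured_ppm(true_freq_hz, measured_freq_hz):
    """Fractional error, in ppm, between the measured and true frequencies."""
    return (measured_freq_hz - true_freq_hz) / true_freq_hz * 1e6

def actual_sample_period_s(nominal_rate_hz, ppm):
    # A signal measured too HIGH means time was compressed, so the real clock
    # period was LONGER than nominal (and vice versa for a low reading).
    return (1 + ppm * 1e-6) / nominal_rate_hz

ppm = measured_ppm(10e6, 10.001e6)       # 10 MHz tone read back as 10.001 MHz
print(ppm)                               # ~100 ppm
print(actual_sample_period_s(1e9, ppm))  # ~1.0001 ns actual period
```

Once the actual period is known, the lifetime results can be rescaled by the ratio of actual to nominal period.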
Since it sounds like you can't control the temperature of the environment you're working in, for the most accurate measurements you would want to measure the timebase clock drift immediately before/after taking your measurements. If you ran your tests in a controlled-temperature environment, you might be able to get away with measuring the timebase clock drift less frequently, but you would still need to run it regularly. The reason is the aging of the timebase oscillator, which affects all oscillators: their accuracy gradually drifts over time. Our specifications account for this drift within the external-calibration interval, but if you're going to measure the actual accuracy, time is another factor that affects it.
For the sake of completeness, I also need to say that when you measure the ppm accuracy with your test signal, this denotes absolute accuracy: it includes not just the error of the timebase but also the error of the signal source. So it's very important to have a highly accurate source for the test signal.
I hope this helps,
Nathan
02-18-2015 11:18 AM
Thanks, Nathan.
02-26-2015 02:02 PM
From LCHEN5154:
"I did measured our NI5154 digitizer's timebase drift and the result is about 2PPM drift per celsius degree. Sounds normal.
My last concern is is there any other operation conditions may affect the timebase? For example how busy is the digitizer is capturing and saving pulse waveforms?"
The clocking engine is separate from the acquisition engine, and really from all the other subsystems, so the timebase is not affected by other operating conditions.
Regards,
Nathan