10-05-2006 11:39 AM
10-06-2006 07:51 AM
10-06-2006 11:57 PM - edited 10-06-2006 11:57 PM
I'll have to learn more about this point to respond very well, but here goes... I understand how shift registers work, but I haven't done anything with tunnels yet. I was probably going to use a stacked sequence structure with shift registers to separate initialization, collection, and closing, but in all of the reading I've done on these forums (which are FANTASTIC, btw!) I have read that sequence structures are not considered particularly elegant. I want to make the code very friendly so the people who work with this stuff after I'm gone (which won't be long) don't have to start from scratch.
@DFGray wrote:
You are correct. Initializing the scope each time you use it adds significant overhead. Initializing once at the beginning with a close at the end is your best option. Pass the reference through loops with a shift register, not a tunnel. This prevents a copy of the reference from being made and ensures that the reference will get through even if your loop does not execute for some reason (e.g. a FOR loop with zero iterations).
I have multiple points to respond to:
FYI, the NI-SCOPE measurement set includes a waveform average. But waveform averaging brings up a subtle point. The 5114 (and most NI-SCOPE devices) has a high-quality time-to-digital converter which reports the time at which a trigger occurs, rather than the data point on which it occurs (see the waveform data or t0 if using the waveform output). This gives you subsample timing resolution. A trigger can occur at any time between two points, and that timing is random unless you have synchronized your data source and the scope (you can synchronize in many cases). Averaging waveforms by simply adding them and dividing by the number of waveforms will smear the result by about half a sample period. In most cases, this is not an issue. However, if you really want "better" data, you can use the timing information and the resample VIs to shift your data in time, so the trigger points all line up, before averaging. You will probably not need to do this, but you should know the option exists.
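Since a LabVIEW diagram can't be pasted as text, here is a rough NumPy sketch of the smearing effect only (the Gaussian test pulse and all numbers are made up for illustration, not taken from the 5114): a pulse whose trigger lands at a random sub-sample offset comes out broader, with a lower peak, after a straight point-by-point average.

```python
# Illustrative only: shows why naive averaging smears when the trigger
# falls at a random sub-sample offset between two points.
import numpy as np

n_samples = 200
n_waveforms = 500
rng = np.random.default_rng(0)

t = np.arange(n_samples)            # sample index, in units of one sample period
averaged = np.zeros(n_samples)

for _ in range(n_waveforms):
    # Trigger lands at a random sub-sample offset in [-0.5, 0.5) of a period.
    jitter = rng.uniform(-0.5, 0.5)
    # A narrow Gaussian pulse centered near sample 100, shifted by the jitter.
    waveform = np.exp(-0.5 * ((t - 100 - jitter) / 0.5) ** 2)
    averaged += waveform

averaged /= n_waveforms

# Compare against a single, un-jittered pulse: the naive average is smeared.
single = np.exp(-0.5 * ((t - 100) / 0.5) ** 2)
print("peak of single pulse :", single.max())     # 1.0
print("peak of naive average:", averaged.max())   # noticeably lower -> smeared
```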
Message Edited by zskillz on 10-06-2006 11:59 PM
10-07-2006 02:58 PM
10-09-2006 08:51 AM - edited 10-09-2006 08:51 AM
My apologies for not making myself clear. Taking your points one at a time.
You can calculate a running average several ways, but the easiest is exponential averaging. In this method, the influence of previous data exponentially decays away. You first need to decide what weight to give previous data and what weight to give the current data. For example's sake, let's assume a 75% weight on previous data and a 25% weight on current data. This is sort of like a four-sample average. To start, take a set of data and load it into a shift register; this will be your averaged data. On subsequent runs, multiply the shift register data by 0.75 and the new data by 0.25, add them together, then display the result and reload the shift register with the new averaged data. Here is a picture of the idea.
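If it helps to see the arithmetic in text form alongside the picture, here is a minimal NumPy sketch of the same idea. The 0.75/0.25 weights come from the example above; the random data is just a stand-in for your scope read, and the function name is made up.

```python
# Sketch of the exponential-averaging arithmetic described above; in LabVIEW
# the "averaged" array lives in a shift register on the acquisition loop.
import numpy as np

OLD_WEIGHT = 0.75   # weight on previously averaged data
NEW_WEIGHT = 0.25   # weight on the newest waveform (roughly a 4-sample average)

def exponential_average(averaged, new_waveform):
    """Return the updated running average for one loop iteration."""
    if averaged is None:                       # first iteration: just load the data
        return np.asarray(new_waveform, dtype=float)
    return OLD_WEIGHT * averaged + NEW_WEIGHT * new_waveform

# Example usage with fake data standing in for the scope fetch:
averaged = None
for _ in range(10):
    new_waveform = np.random.normal(size=1000)   # placeholder for a scope read
    averaged = exponential_average(averaged, new_waveform)
    # ...display "averaged" here and feed it back into the shift register
```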
Let me know if you need more clarification.
Message Edited by DFGray on 10-09-2006 08:58 AM
10-09-2006 09:02 AM
10-23-2006 12:07 PM
@DFGray wrote:
However, if you really want "better" data, you can use the timing information and the resample VIs to shift your data in time, so the trigger points all line up, before averaging. You will probably not need to do this, but you should know the option exists.
10-24-2006 08:17 AM
This may help. Let me know where I need to fill in the details.
10-24-2006 12:06 PM - edited 10-24-2006 12:06 PM
@DFGray wrote:
This may help. Let me know where I need to fill in the details. Good luck.
- Take your first data set and record the relativeInitialX value (a shift register works well). This will be your baseline for future calculations.
- Put the first waveform in your average buffer.
- On each succeeding waveform, find the difference in relativeInitialX between that waveform and the first waveform you took. This value should be less than a sample period, but could be positive or negative.
- Now use some sort of interpolation scheme to find the data at each point relative to the data you already have. Off the top of my head, I can think of three ways to do this, listed from easiest to hardest and (I think) from slowest to fastest:
- Use the Align and Resample Express VI. This will not be available if you have the base version of LabVIEW.
- Use FFTs to shift your data. To do this, take the FFT of your data, shift the phase of each element by the correct amount to move the waveform in time, then take the inverse FFT. The phase shift is different for each frequency bin, since it grows linearly with frequency (see the sketch after this list). Since you are moving by less than one sample width, this works very well. However, I would pad each end by at least 10 extra points to take care of any ringing artifacts.
- Use a Savitzky-Golay filter to interpolate your data. You can find details on the Savitzky-Golay filter in Numerical Recipes in C by Press et al. It is available in most libraries.
- Average the result into your average buffer as before.
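For what it's worth, here is a rough NumPy sketch of option 2 (the FFT phase shift) applied to the steps above. It is not the actual VI: the synthetic pulses stand in for your scope data, the pad of 10 points per end follows the suggestion above, and the sign of the offset depends on how relativeInitialX is reported, so verify it against real data before trusting it.

```python
# Rough sketch: align waveforms by a fractional-sample FFT shift, then
# exponentially average them, as outlined in the steps above.
import numpy as np

PAD = 10  # extra points per end to absorb ringing artifacts

def shift_waveform(y, shift_samples):
    """Shift y in time by a fractional number of samples using the FFT."""
    padded = np.concatenate([np.full(PAD, y[0]), y, np.full(PAD, y[-1])])
    spectrum = np.fft.fft(padded)
    freqs = np.fft.fftfreq(padded.size)          # cycles per sample
    # Delaying by shift_samples multiplies each bin by exp(-2*pi*j*f*shift);
    # the phase change grows linearly with frequency, as noted above.
    spectrum *= np.exp(-2j * np.pi * freqs * shift_samples)
    return np.real(np.fft.ifft(spectrum))[PAD:-PAD]

# Synthetic example: pulses whose triggers land at random sub-sample offsets.
rng = np.random.default_rng(1)
t = np.arange(200)
averaged = None
for _ in range(100):
    offset = rng.uniform(-0.5, 0.5)   # difference in relativeInitialX, in samples
    waveform = np.exp(-0.5 * ((t - 100 - offset) / 1.0) ** 2)
    aligned = shift_waveform(waveform, -offset)  # undo the offset before averaging
    averaged = aligned if averaged is None else 0.75 * averaged + 0.25 * aligned

print("peak after aligned averaging:", averaged.max())   # ~1.0, not smeared
```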
Message Edited by zskillz on 10-24-2006 12:07 PM
Message Edited by zskillz on 10-24-2006 12:09 PM
10-25-2006 09:39 AM