05-16-2016 10:26 AM
Hello, I have a problem with a phase difference caused by interchannel delay. I am measuring voltage on two channels of an NI USB-6211 (see attachment), and because it has multiplexed inputs there is a phase difference between the two channels. I heard that there is some LabVIEW function that can solve this problem very easily. Can you tell me how?
Thanks a lot
Dalibor
05-16-2016 02:41 PM
If you just need to measure the phase delay:
Apply one (sine) signal to both channels and use tone detection; the phase difference between the channels is your phase delay, which you can use to calculate the group delay (test it with more frequencies 😉 ). For a given sample rate & channel configuration this should be constant. You can use that value (or values) to correct the phase measurement (see the sketch at the end of this post).
If you want both channels corrected for the time delay, you can design a filter to shift one or both signals, or use the FFT, shift the phase, and inverse FFT.
If you only want to display them time-corrected, measure the group delay and correct the X0 value of one channel 😉
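If it helps to see the idea outside of LabVIEW (where you would use something like the Extract Single Tone Information VI), here is a minimal Python/NumPy sketch of measuring the interchannel delay from one tone applied to both channels. The sample rate, test frequency, and the simulated waveforms are only placeholders:

```python
import numpy as np

def tone_phase(x, fs, f0):
    """Phase (rad) of the f0 component of x via a single-bin DFT.
    Only the channel-to-channel difference of this value is used."""
    n = np.arange(len(x))
    c = np.sum(x * np.exp(-2j * np.pi * f0 * n / fs))
    return np.angle(c)

# Assumed setup: both channels are driven by the SAME sine at f0.
fs = 100e3            # sample rate in Hz (placeholder)
f0 = 1e3              # test-tone frequency in Hz (placeholder)
t = np.arange(10000) / fs
true_delay = 10e-6    # simulated interchannel delay, for demonstration only
v_ch0 = np.sin(2 * np.pi * f0 * t)
v_ch1 = np.sin(2 * np.pi * f0 * (t - true_delay))

# Phase difference caused by the multiplexed (sequential) sampling
dphi = tone_phase(v_ch1, fs, f0) - tone_phase(v_ch0, fs, f0)
dphi = np.angle(np.exp(1j * dphi))       # wrap to (-pi, pi]
delay = -dphi / (2 * np.pi * f0)         # interchannel (group) delay in seconds
print(f"interchannel delay ~ {delay * 1e6:.2f} us")

# Later, for a measurement at any frequency f, correct channel 1's phase:
# phi_ch1_corrected = phi_ch1_measured + 2 * np.pi * f * delay
```

The same correction works at every excitation frequency once the delay is known, since the phase error of a pure delay grows linearly with frequency.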
05-16-2016 02:54 PM - edited 05-16-2016 02:55 PM
What is the best solution for my case? I am measuring an impedance characteristic, which means I have to change the frequency of the sine wave, so I think I can't use tone detection here.
05-17-2016 09:01 AM - edited 05-17-2016 09:04 AM
No, you measure a transfer function 😉 maybe, due to the impedances, of a certain shape 😄
Is the excitation always a sine (or multisine) function?
If yes, tone detection is a very nice tool to measure amplitude and phase.
Apply one sine signal to both channels, measure the phase difference, calculate the interchannel delay, and use that value to correct the phase in future measurements.
BUT be aware of channel crosstalk!!
And if you measure twice with swapped channels, you can correct most channel errors (except nonlinearities); see the sketch below ... (OK, crosstalk ... I don't know 😞 )
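A rough Python/NumPy sketch of that swap trick (not LabVIEW; the ratio values are made-up placeholders). The per-channel gain and phase responses cancel in the geometric mean of the two measurements:

```python
import numpy as np

def swap_corrected_ratio(r_normal, r_swapped):
    """Combine two complex ratio measurements taken with the input cables
    swapped between them, so the per-channel gain/phase responses cancel.
    r_normal  = H * (Cb / Ca),   r_swapped = H * (Ca / Cb)
    The geometric mean recovers H; nonlinearities are NOT removed."""
    mag = np.sqrt(np.abs(r_normal) * np.abs(r_swapped))
    # Average the phases; assumes the channel mismatch is small enough
    # that no extra 2*pi wrap occurs between the two measurements.
    phase = 0.5 * (np.angle(r_normal) + np.angle(r_swapped))
    return mag * np.exp(1j * phase)

# Hypothetical numbers, just to show the call:
r1 = 0.95 * np.exp(1j * 0.30)   # DUT ratio seen with the normal hookup
r2 = 1.05 * np.exp(1j * 0.26)   # same DUT, channels swapped
print(swap_corrected_ratio(r1, r2))   # ~1.0 * exp(1j*0.28): channel error cancelled
```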
05-17-2016 10:55 AM
@stys_dali wrote: What is the best solution for my case? I am measuring an impedance characteristic, which means I have to change the frequency of the sine wave, so I think I can't use tone detection here.
You stated that you want to measure impedance. If I take an educated guess at your test configuration, you will have to deal with more than a phase difference between signals. You will also have some sort of amplitude difference.
The most basic way to get impedance is to use Ohm's law (Z = V/I). Your instrument is measuring two voltages, so you need to convert the actual current (I) into a voltage using some sort of current transducer. It could be a clamp-on current probe, a series (shunt) resistor, etc. That transducer will introduce its own transfer function of amplitude and phase onto your signal.
One way to calibrate your test system is with a known set of standards. In your case, it may be as simple as a (non-inductive) resistor of a known value (like 100 ohms). Inject a sine wave at one frequency across the resistor. Simultaneously measure the voltage across the resistor with one channel, and the voltage produced by the current transducer with the other channel. Dividing one channel by the other (V/I) gives you the uncorrected impedance (magnitude/phase). Your ideal resistor should measure 100 ohms and 0 deg phase. More than likely, your result will not be exactly 100 ohms. You can then find the correction factor by dividing the known standard value (100 ohms) by your result. You would need to repeat this process for each frequency of interest. Save these calibration factors and use them to correct future measurements.
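To make the bookkeeping concrete, here is a rough Python/NumPy sketch of that calibration flow (not LabVIEW code; all numeric values and the 1 ohm shunt are made-up placeholders, and the complex readings stand for the amplitude/phase pairs you would get from tone detection on each channel):

```python
import numpy as np

R_REF = 100.0                      # known (non-inductive) calibration resistor, ohms

def measured_impedance(v_volt, v_curr, r_shunt):
    """Raw complex impedance from the two channel readings:
    v_volt  - complex voltage across the DUT (amplitude * exp(1j*phase))
    v_curr  - complex voltage across the current shunt
    r_shunt - shunt resistance used as the current transducer."""
    i = v_curr / r_shunt           # complex current through the DUT
    return v_volt / i

# Calibration at one frequency with the 100 ohm standard (placeholder values;
# the small phase on the current channel mimics the interchannel delay error).
z_raw_ref = measured_impedance(1.00 * np.exp(1j * 0.00),
                               0.0098 * np.exp(1j * 0.07),
                               r_shunt=1.0)
corr = R_REF / z_raw_ref           # complex correction factor for this frequency

# Later, correct a DUT measurement taken at the same frequency.
z_raw_dut = measured_impedance(0.80 * np.exp(1j * -0.50),
                               0.0098 * np.exp(1j * 0.07),
                               r_shunt=1.0)
z_dut = corr * z_raw_dut
print(abs(z_dut), np.degrees(np.angle(z_dut)))
# Repeat the calibration for every test frequency and store corr(f) in a table.
```

Because the correction factor is complex, it fixes both the amplitude error of the transducer and the phase error from the interchannel delay in one step.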
05-18-2016 01:51 AM
Have a look at the shipped examples ... Baseband Frequency Response (DAQmx)