09-09-2019 06:11 PM
Dear experts,
I am currently working on an image reconstruction project using the 7962R FPGA + NI 5734. I am using LabVIEW 2017 (32-bit). I am experienced in Verilog coding, but this is actually my first time using LabVIEW FPGA.
Short description:
I use the example (NI 5734 - Getting Started) to test the NI 5734, but it does not work as expected.
Four photos are listed below to show what happened. All input signals are generated by a function/arbitrary waveform generator (RIGOL DG4162).
PHOTO ONE:
The input signal is a 0.5 V, 100 kHz sine wave.
The displayed waveform is a 1.2 V, 100 kHz sine wave with almost no distortion.
PHOTO TWO:
The input signal is a 0.5 V, 10 kHz sine wave.
The displayed waveform is a 0.6 V, 10 kHz sine wave with almost no distortion. I noticed the displayed amplitude is about 50% of that in Photo One.
PHOTO THREE:
The input signal is a 0.5 V, 1 kHz sine wave.
The displayed waveform is a 0.08 V, 1 kHz sine wave, and the noise is large. It is worth noting that the displayed amplitude is only about 7% of that in Photo One.
PHOTO FOUR:
The input signal is a 0.5 V, 100 Hz sine wave.
The displayed waveform is a noise-filled sine wave. It looks terrible!
The Problem:
Why does this happen? As the input frequency decreases, the displayed signal amplitude gradually decays. I am certain the input signal is displayed clearly and correctly when it is connected to an oscilloscope.
Is the NI 5734 broken? How can I check it?
Thank you very much!
Tao Zhou
09-09-2019 07:23 PM
I don't have that example to look at, but I wonder if there is any averaging or filtering going on. Is there an obvious place in the example to set this?
If not, can you set your function generator to output an offset sine wave and see if the offset comes through unattenuated?
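(For reference, a minimal Python sketch of why the offset test is useful: a DC offset passes through a DC-coupled path unchanged but is blocked by an AC-coupled, i.e. high-pass, path. The 10 kHz cutoff below is a hypothetical value chosen only for illustration, not an NI 5734 specification.)

# Illustrative only: why a DC offset quickly reveals AC coupling.
# The 10 kHz high-pass cutoff is a hypothetical value, not a datasheet figure.
import numpy as np
from scipy import signal

fs = 1_000_000                      # simulated sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)      # 50 ms of data
x = 0.5 * np.sin(2 * np.pi * 100 * t) + 0.5   # 100 Hz sine with a +0.5 V offset

# First-order Butterworth high-pass models an AC-coupled input path
b, a = signal.butter(1, 10_000, btype="highpass", fs=fs)
y_ac = signal.lfilter(b, a, x)      # AC-coupled: offset and low frequencies removed
y_dc = x                            # DC-coupled: everything passes through

print(f"mean of DC-coupled signal: {y_dc.mean():+.3f} V")                 # ~ +0.5 V
print(f"mean of AC-coupled signal: {y_ac[len(y_ac)//2:].mean():+.3f} V")  # ~ 0 V

If the offset survives, the path is DC-coupled and the attenuation must come from somewhere else (for example averaging or filtering in the example VI).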
09-10-2019 08:06 AM - edited 09-10-2019 08:07 AM
This is the
09-18-2019 01:52 PM
Thank you! I have solved this problem: it was caused by the wrong coupling method.
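(Note for later readers: the reported amplitudes are consistent with a first-order high-pass, i.e. AC-coupled, input path. The sketch below assumes a hypothetical cutoff of about 17 kHz, chosen only so the numbers resemble the four photos; it is not taken from the NI 5734 specifications.)

# Illustrative sketch: relative gain of a first-order high-pass filter,
# |H(f)| = f / sqrt(f^2 + fc^2). The 17 kHz cutoff is a hypothetical value.
import math

fc = 17_000.0                        # assumed -3 dB cutoff, Hz
for f in (100_000, 10_000, 1_000, 100):
    gain = f / math.sqrt(f**2 + fc**2)
    print(f"{f:>7} Hz: {gain:6.1%} of passband amplitude")

This gives roughly 99%, 51%, 6%, and 0.6%, which lines up with the full, ~50%, and ~7% amplitudes reported above, and with the 100 Hz signal disappearing into the noise.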
09-18-2019 03:36 PM
Great, thank you for the update!