03-28-2014 04:43 PM
When attempting to implement an FIR bandpass filter and a Hilbert transformer filter using the FIR Compiler, I consistently encounter an issue in which the output data scales up to on the order of 10^8. The goal of the FPGA code I am writing is to calculate the instantaneous frequency from this transform, so scaling is definitely a problem. I also noticed that by changing the fixed-point word lengths I was able to get different scalings, for reasons I do not fully understand. Could someone help me achieve these transforms without scaling the output so severely?
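For what it's worth, this kind of gain often comes from the fixed-point coefficient quantization itself: when taps are stored as integers scaled by 2^frac_bits, the filter output carries that same factor unless it is shifted back down. Below is a minimal Python sketch of that effect (the tap values and the choice of 15 fractional bits are illustrative assumptions, not the FIR Compiler's actual internals):

```python
import numpy as np

# Assumed fractional word length for coefficient quantization
frac_bits = 15
# Example lowpass taps with unity DC gain (hypothetical values)
taps = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
# Quantize: each tap becomes an integer scaled by 2**frac_bits
q_taps = np.round(taps * 2**frac_bits).astype(np.int64)

x = np.ones(32)  # DC input, so the true output should be ~1.0
y_float = np.convolve(x, taps)[len(taps) - 1]    # floating-point reference
y_fixed = np.convolve(x, q_taps)[len(taps) - 1]  # integer result, ~2**15 too big

# Right-shifting (dividing) by 2**frac_bits recovers the true output
print(abs(y_fixed / 2**frac_bits - y_float) < 1e-3)  # → True
```

If the word length changes, the implicit scale factor changes with it, which would explain why different fixed-point settings produce different output magnitudes.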
03-31-2014 06:57 PM
Hi jarrisonwins,
Thank you for the post! What hardware are you working with?
As a side note, I would also recommend posting on the Xilinx forums if you have not already done so, as you are likely to find more users of the FIR Compiler there. While it can be called from LabVIEW, it was primarily developed by Xilinx.