03-05-2025 01:08 AM
I am looking to develop a project that can automatically determine the bandwidth occupied by the highest peak signal. I have attached the file I created, but it is not accurately calculating the bandwidth. Could someone please guide me on how to achieve this?
03-05-2025 01:32 AM
Can you show some typical data?
(run the VI until the "baseband power spectrum" contains data, stop the VI, select that graph, and do an Edit > Make Selected Values Default)
Explain your definition of "not accurately". In what way is the result not as expected?
There is a bit of code smell. Hammering the x-scale offset property on every iteration is a bit much; writing it once is probably enough. Properties should only be written when they change. There is also a +1 followed by a -2 primitive, etc.
03-05-2025 10:39 PM - edited 03-05-2025 10:58 PM
I am attaching the VI file containing some saved data. The code calculates a bandwidth (BW) of 244 Hz; however, when measured directly on the spectrum, it reads 226 Hz, and sometimes even 200 Hz. This discrepancy is what I mean by "not accurately." The figure below also illustrates this when the difference (upper limit - lower limit) is calculated.
I am currently using LabVIEW 2024. In LabVIEW 2017, I had access to the Spectral Measurements Toolkit, which included the SMT Occupied BW VI that automatically calculates the bandwidth of the highest peak signal. However, that toolkit does not work in LabVIEW 2024. Is there an alternative toolkit or VI available for LabVIEW 2024 that performs the same function? I have tried implementing a manual solution, but I am encountering the issues mentioned above.
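Not a LabVIEW answer, but for reference, the usual occupied-bandwidth definition (the band containing X% of the total power, with the remainder split equally between the two tails) is straightforward to reproduce manually. Below is a minimal Python sketch of that calculation; the function name and the synthetic test spectrum are my own, and I am assuming the SMT VI used a similar percentage-of-power definition. One possible cause of jumpy results like 244 / 226 / 200 Hz is searching for threshold crossings on whole bins: without interpolation, the answer is quantized to multiples of the FFT bin spacing.

```python
import numpy as np

def occupied_bandwidth(freqs, power_lin, fraction=0.99):
    """Estimate occupied bandwidth: the band containing `fraction` of the
    total power, with the remainder split equally between the two tails.
    `power_lin` must be linear power (not dB), `freqs` monotonically rising."""
    cum = np.cumsum(power_lin)
    total = cum[-1]
    lo_level = (1.0 - fraction) / 2.0 * total
    hi_level = (1.0 + fraction) / 2.0 * total
    # Interpolate between bins so the result is not quantized
    # to multiples of the bin spacing.
    f_lo = np.interp(lo_level, cum, freqs)
    f_hi = np.interp(hi_level, cum, freqs)
    return f_hi - f_lo, f_lo, f_hi

# Synthetic example: a Gaussian-shaped peak (sigma = 50 Hz at 2 kHz)
# on a small noise floor; the 99% power bandwidth of a Gaussian is
# roughly 2 * 2.576 * sigma, i.e. about 258 Hz here.
df = 2.0                                   # bin spacing, Hz
freqs = np.arange(0.0, 4000.0, df)
power = np.exp(-0.5 * ((freqs - 2000.0) / 50.0) ** 2) + 1e-6
bw, f_lo, f_hi = occupied_bandwidth(freqs, power, fraction=0.99)
```

If your spectrum contains several carriers and you only want the highest peak, you would first window the arrays to a region around the peak index before calling the function, otherwise the tail power of the other signals skews the limits.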
03-09-2025 08:01 PM - edited 03-09-2025 08:03 PM