Hi,
I prepared LabVIEW code for a retriggering approach using APFI0 (an analog hardware trigger) with a trigger level of 1.0 V, a trigger frequency of 360 Hz, and a sampling rate of 48 kHz (the maximum for multichannel AI on the NI 6255 Mass Termination), in finite mode with 2 samples per channel. However, I am facing a delay issue: the code should complete 360 iterations of the 360 Hz trigger in roughly 1 to 1.7 seconds, but it currently takes 5.5 to 6.0 seconds. I first tried the DAQ Assistant, but it did not work, so I switched to a DAQmx task; it runs, but still takes the same amount of time.
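Since I cannot attach the block diagram here, this is the DAQmx configuration I am describing, written out as a rough sketch in Python's nidaqmx syntax (the device name "Dev1" and the channel list are placeholders, and the exact APFI0 terminal name may need the device prefix on your system):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, Slope

task = nidaqmx.Task()
# Multichannel AI on the 6255 (channel list is a placeholder).
task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")
# Finite acquisition: 2 samples per channel at 48 kS/s.
task.timing.cfg_samp_clk_timing(
    rate=48000.0,
    sample_mode=AcquisitionType.FINITE,
    samps_per_chan=2,
)
# Analog edge start trigger on APFI0, rising edge, 1.0 V level.
# The terminal may need to be "/Dev1/APFI0" depending on the setup.
task.triggers.start_trigger.cfg_anlg_edge_start_trig(
    trigger_source="APFI0",
    trigger_slope=Slope.RISING,
    trigger_level=1.0,
)
```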
I then prepared Python code for the same approach, which runs in about one second. However, I am concerned about its behaviour: when I do not connect the input signals and provide only the trigger signal (same amplitude and frequency), it should still trigger and return random (floating) values, but instead it waits until the 10-second read timeout expires. In contrast, when I run the LabVIEW code under the same conditions, it shows random signals, indicating that it is detecting the trigger.
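The Python version re-arms the finite task in a loop of 360 iterations, roughly like this (continuing the sketch above, so it uses the same `task` object; the timing print is only for checking the overall run time):

```python
import time

t0 = time.time()
for _ in range(360):
    task.start()                         # arm the task; it waits for the APFI0 edge
    data = task.read(
        number_of_samples_per_channel=2,
        timeout=10.0,                    # this is the 10 s timeout that is hit
    )                                    # when no trigger edge is detected
    task.stop()
print("Elapsed:", time.time() - t0, "seconds")
task.close()
```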
Please guide me on what mistakes I might be making that cause the continuous delays in LabVIEW, and why the Python code meets the timing requirement but fails to trigger when the input channels are disconnected. Is it possible that the Python code is unable to read the APFI0 trigger signal?
Kind regards
Hasham