Error -200290 occurred need help with project!


I'm working on a project where I'm making an automatic guitar tuner. The tuner works great without issues. Right now I'm optimizing it and adding a few small things here and there.

One of these things is making the boolean LEDs light up in a sequence and turn off again when the footswitch is pressed. It works as it should, but I immediately get the following error:

"Error -200290 occurred at Gitaartuner.vi:Instance:12:830001

Possible reason(s):

The generation has stopped to prevent the regeneration of old samples. Your application was unable to write samples to the background buffer fast enough to prevent old samples from being regenerated.
To avoid this error, you can do any of the following:
1. Increase the size of the background buffer by configuring the buffer.
2. Increase the number of samples you write each time you invoke a write operation.
3. Write samples more often.
4. Reduce the sample rate.
5. If your data transfer method is interrupts, try using DMA or USB Bulk.
6. Reduce the number of applications your computer is executing concurrently.
In addition, if you do not need to write every sample that is generated, you can configure the regeneration mode to allow regeneration, and then use the Position and Offset attributes to write the desired samples.

Task Name: _unnamedTask<3>"
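
For reference, the buffer-size and regeneration settings from that list correspond to DAQmx properties. A rough sketch of what remedy 1 and the regeneration note look like in the nidaqmx Python API ("Dev1/ao0" and all the numbers are just placeholders, my real task is in the attached LabVIEW VI):

```python
# Rough illustration only; the real task is the attached LabVIEW VI.
# "Dev1/ao0", the rate, and the buffer sizes are placeholder values.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, RegenerationMode

with nidaqmx.Task() as task:
    task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    task.timing.cfg_samp_clk_timing(
        rate=10000.0,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=10000,
    )

    # Remedy 1: enlarge the background (output) buffer.
    task.out_stream.output_buf_size = 20000

    # Final note: allow regeneration if every generated sample does not
    # have to be freshly written.
    task.out_stream.regen_mode = RegenerationMode.ALLOW_REGENERATION

    # Remedies 2 and 3: write larger blocks, and write them often enough.
    block = 0.5 * np.sin(2 * np.pi * 440.0 * np.arange(10000) / 10000.0)
    task.write(block, auto_start=True)
```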

If I remove the cases that contain the delays, the issue seems to be solved, but I need them for the tuner to work the way I want. I also tried playing with the sample rate and the number of samples, but nothing seems to help. I searched the forum here as well, but no luck so far. Does anyone know how I could solve this issue without removing the "delay cases" from the VI?

Please keep in mind that the VI doesn't look very polished right now and the added limits are not correct yet; these all need to be changed/corrected, but those values do not affect the basic operation of the tuner, so for now they can be ignored and are not the culprit of the issue.

You can find all the files, together with a screenshot of the error I mentioned, in the attached .zip file below.

I appreciate all the help and feedback!

Message 1 of 4

I'm not understanding the point of your delays.  Why not just write the analysis results straight to the boolean indicators?  A wait right before updating the indicator just seems weird to me.

 

Generally, a loop with DAQ in it should (almost) never contain a delay.  The DAQmx call will limit the loop rate on its own.  When you add your own delays on top of that, you get the buffer errors you are seeing.  Error -200290 means your loop is not writing fresh samples to the DAQ output buffer fast enough, so the buffer runs out of new data before the next write arrives.
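
To put that in text form (a rough sketch with the nidaqmx Python API, not your VI; "Dev1/ao0" and the numbers are placeholders): the DAQmx call itself blocks until the hardware is ready for more data, so it paces the loop all by itself.

```python
# Sketch of the timing point only; names and numbers are placeholders.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, RegenerationMode

with nidaqmx.Task() as task:
    task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    task.timing.cfg_samp_clk_timing(
        rate=10000.0, sample_mode=AcquisitionType.CONTINUOUS, samps_per_chan=10000
    )
    # -200290 implies regeneration is disallowed, so every sample must be fresh.
    task.out_stream.regen_mode = RegenerationMode.DONT_ALLOW_REGENERATION

    block = np.zeros(10000)        # one second of data per write at 10 kS/s
    task.write(block)              # pre-fill the buffer
    task.start()

    while True:
        task.write(block)          # blocks until there is room in the buffer,
                                   # so this call alone sets the loop rate
        # Adding a Wait/sleep here makes the next write arrive late, the
        # buffer runs out of fresh samples, and you get -200290.
```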


Message 2 of 4

Thank you for your response!

So using those delays is the cause of those errors, I understand now. The delays are basically for a "start up" sequence: when a user presses the footswitch on the pedal, all the boolean indicators should light up in a specific order and go off again in a particular order before the VI goes to the "idle" state. There is a video in the .zip I attached; I will put the video in this comment's attachment as well. I thought I could only manage this type of behaviour with the boolean indicators by using those delays. Is there any other way to achieve this by any chance? I tried other methods but nothing is working for me so far. I only manage to get the indicators to light up all at the same time, which is exactly what I don't want 😄

I hope you or anyone else could help me with this.

Any feedback or help is very much appreciated 🙂

Message 3 of 4

Can you "save for previous version" and choose LabVIEW 2019 or 2021?  [Even 2020 would work].

 

If you are doing continuous data acquisition, you should probably be using a Producer/Consumer Design Pattern, where the Producer is a While Loop that is "clocked" by a single DAQmx Read (no functions from the Timing Palette) and therefore runs at the rate set by DAQmx.  For example, if you are taking analog samples "continuously" with 1000 samples at 1 kHz, then exactly once a second the DAQmx Read will deliver an Array or Waveform (however you specified the Read) to you.  What you do with these data is put them in a Queue or a Stream Channel Wire and get them out of the Producer Loop, so that the Loop sits "idle" for 999 ms and, one second later, the Read delivers yet another block of samples.

 

The Queue/Stream goes to another Loop, the Consumer Loop, running in parallel (unless you connect a wire from the output of the Producer to the Input of the Consumer -- don't do this!).  It sits idle, waiting for data to be delivered to the Dequeue Element (or Stream Reader) inside the Consumer Loop.  Here you plot, write to disk, or whatever you want to do with the data.  Because the Queue/Channel Wire is somewhat "elastic", as long as you can process the data so that the 1000 points are handled (on average) in less than a second, you'll be fine.

 

By doing this, you take two processes that need to run once per second and have them run in parallel.  The Producer Loop takes basically no CPU time (it's 99.99% done in the DAQ hardware) until all the data are present, then it's a quick write of the data to the Queue, while the Consumer Loop does (in computer memory and software timing) whatever processing, graphing, or disk I/O is needed, and can use the otherwise-idle 999 msec that the Producer isn't using.
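
If it helps to see the same structure outside a block diagram, here is a rough text-language sketch of the pattern (Python with the nidaqmx package; "Dev1/ai0" is a placeholder, and the 1000 samples at 1 kHz are just the example numbers from above):

```python
# Producer/Consumer sketch of the pattern described above, in Python instead
# of LabVIEW.  "Dev1/ai0" is a placeholder; 1000 samples at 1 kHz are the
# example numbers from the text.
import queue
import threading
import nidaqmx
from nidaqmx.constants import AcquisitionType

data_queue = queue.Queue()            # plays the role of the Queue / Channel Wire
stop = threading.Event()

def process(block):
    """Placeholder for the pitch analysis, display updates, logging, etc."""
    pass

def producer():
    """'Clocked' by the DAQmx Read: one 1000-sample block per second."""
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        task.timing.cfg_samp_clk_timing(
            rate=1000.0,
            sample_mode=AcquisitionType.CONTINUOUS,
            samps_per_chan=1000,
        )
        task.start()
        while not stop.is_set():
            # Blocks ~1 s while the hardware fills the buffer -- no Wait needed.
            block = task.read(number_of_samples_per_channel=1000)
            data_queue.put(block)
    data_queue.put(None)              # sentinel so the consumer knows to finish

def consumer():
    """Sits idle until a block arrives, then does whatever work is needed."""
    while True:
        block = data_queue.get()
        if block is None:
            break
        process(block)

# The two loops run in parallel, like two independent While Loops.
threading.Thread(target=producer).start()
threading.Thread(target=consumer).start()
```

The queue is the "elastic" part: as long as process() averages under a second per block, nothing backs up.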

 

Bob Schor

Message 4 of 4