Multifunction DAQ


Why am I getting Error -200479?

I have an analog input signal (20 kHz) that I need to acquire. The signal comes from a demultiplexer unit that I plan to test. The test involves sending a digital signal from 4 digital output pins of the DAQ to the demultiplexer circuit, which has a switching speed of 1 MHz. The digital output from the DAQ will therefore run at 1 MHz (i.e., a 1-microsecond pulse train). Every microsecond the DAQ should send a digital code (any combination of the 4 bits mentioned) and acquire the analog input signal (20 kHz). I am planning to use implicit timing mode for the sample clock, but that is not working either.

Details of my DAQ: NI USB-6259

Speed: 1.25 MS/s

So finally I have to design a system that can acquire an analog signal of 20 kHz (maximum frequency) after sending a 1 MHz digital signal to the chip, without the use of an external clock. Please provide me with any ideas on how this is possible.
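For reference, the timing budget described above can be sanity-checked with a short sketch (plain Python, no hardware; the numbers are taken from this post):

```python
# Sanity-check the timing budget described above (no hardware needed).
do_rate_hz = 1_000_000      # digital output update rate: 1 MHz
ai_max_rate_hz = 1_250_000  # USB-6259 single-channel AI maximum: 1.25 MS/s
signal_hz = 20_000          # analog signal of interest: 20 kHz

# One full sweep through all 4-bit codes takes 16 updates = 16 us.
codes = 16
cycle_us = codes * 1e6 / do_rate_hz
print(cycle_us)  # 16.0

# A 1 MHz shared sample clock stays within the board's AI maximum,
# and each 50 us period of the 20 kHz signal gets sampled 50 times.
print(do_rate_hz <= ai_max_rate_hz)  # True
samples_per_period = do_rate_hz / signal_hz
print(samples_per_period)  # 50.0
```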

I have also attached my VI to this post.

Regards,

Pradeep.

 

Message 1 of 4

You've got a decent start, but there are a number of details to address.  Here are some things I notice in no particular order:

 

1. You only wrote a single digital sample to your DO buffer.   You set the buffer size to 1 million -- why aren't you filling it?
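To illustrate the idea of filling the whole buffer rather than writing a single sample, here is a plain-Python sketch (the list stands in for the array you would hand to DAQmx Write; the names are mine, not DAQmx's):

```python
# Fill the whole DO buffer instead of writing one sample.
# Each element is a 4-bit port value (0-15) for the demux select
# lines; the 16-code sweep repeats until 1 million samples exist.
BUFFER_SIZE = 1_000_000
sweep = list(range(16))              # every 4-bit combination
reps = BUFFER_SIZE // len(sweep)     # 62,500 full sweeps
pattern = sweep * reps               # the data to hand to DAQmx Write
print(len(pattern))  # 1000000
```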

 

2. At least for early testing, you should also call a DAQmx Write Property node on your DO task and set the regeneration mode to "allow regeneration."
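To see what "allow regeneration" buys you, here is a toy model in plain Python (function and names are mine, purely illustrative): with regeneration on, the hardware effectively re-reads the same buffer once it reaches the end instead of erroring out when no fresh data has been written.

```python
def output_sample(buffer, n, regeneration=True):
    """Return the n-th sample the device would drive out."""
    if n < len(buffer):
        return buffer[n]
    if regeneration:
        return buffer[n % len(buffer)]  # wrap around: replay old data
    raise RuntimeError("underflow: no new data written (regeneration off)")

buf = [0, 1, 2, 3]
print(output_sample(buf, 6))  # 2 -- the buffer is replayed
```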

 

3. An analogous problem is found in your main loop, where you are trying to read 1 AI sample at a time and graph it.  You'll never keep up at a 1 MHz sampling rate!  For early testing, make a 1-million-sample buffer, set the AI task to finite sampling, and read the data all at once after all 1 million samples have been taken.  You can graph it if you want, but remember that your graph can only show about 1000 pixels at a time.
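On the graphing point: a 1-million-sample read is far more data than a ~1000-pixel graph can display, so decimating before plotting keeps the peaks visible. A min/max decimation sketch in plain Python (the function name and test waveform are mine):

```python
def minmax_decimate(samples, target_points=1000):
    """Reduce a long waveform to ~target_points while preserving peaks:
    split into bins and keep each bin's min and max."""
    bin_size = max(1, len(samples) // (target_points // 2))
    out = []
    for i in range(0, len(samples), bin_size):
        chunk = samples[i:i + bin_size]
        out.append(min(chunk))
        out.append(max(chunk))
    return out

data = [((i * 7) % 100) - 50 for i in range(1_000_000)]  # fake waveform
small = minmax_decimate(data)
print(len(small))  # 1000
```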

 

4. You'll probably eventually want to run the DO task on the leading edge of the pulse train and the AI task on the trailing edge.  You may also want to specify a non-default duty cycle.
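The leading/trailing-edge idea can be illustrated numerically (plain Python; the 50% duty cycle here is a hypothetical choice, not a recommendation): updating DO on the rising edge and sampling AI on the falling edge gives the demux output part of a period to settle before it is measured.

```python
freq_hz = 1_000_000
duty = 0.5                       # hypothetical duty cycle
period = 1.0 / freq_hz           # 1 us clock period

# DO updates on rising edges; AI samples on the following falling edges.
do_edges = [n * period for n in range(4)]
ai_edges = [n * period + duty * period for n in range(4)]

settle = ai_edges[0] - do_edges[0]
print(settle)  # 5e-07 -- half a period of settling before each AI sample
```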

 

5. On the good side, you've got the right sequence for starting the DAQmx tasks.  Both the AI and DO are started and ready before you generate the pulse train that acts as a sample clock.

 

-Kevin P

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 2 of 4

 

I am trying to test an IC to which I have to send a digital output on 4 lines using the DAQ. The DAQ must also be able to acquire the signals from the chip using a single analog input channel. The digital output and analog input channels have to work synchronously, so that whenever the digital output is written, the analog input takes the signal in.
There will be 16 channels, from which I have to select one channel at a time using the digital output from the DAQ. After this, the analog input will acquire the signal at its pin. For this purpose, the digital output and the analog input have to be synchronized, so a counter was used to clock both the analog input and the digital output. The counter itself is clocked by the internal clock, by setting the counter's clock source to "Implicit".
Now here is the problem: when the VI executes, the digital output's "Start Task" gives an error saying the buffer has to be increased in size, so I configured the buffer size to be 20000 before the digital write step. But after this the error is "Specified operation cannot be performed while the task is running." This is the current problem; it happens at the "Start Task" of the digital output in the VI.
Also, the clock has to be 1 microsecond (1 MHz).

 


Regards,

Pradeep

Message 3 of 4

Here are the changes I was talking about.  I don't have hardware to test with, but this will at least get you much closer.  Compare it closely with your original to see what the differences are, so you can find out what you need to learn about.

 

(Note: I don't have one of the subVIs, so the VI came up non-executable for me when I opened and saved it.  Hopefully the saved version will find that subVI on your machine.)

 

-Kevin P

Message 4 of 4