DAQmx Esoteric Properties Help

I am struggling with some of the less commonly used DAQmx properties. Some background:

  1. My problems arise in the "Log Only" case, not the "Log and Read" case.
  2. I can't post the original code here (Apologies in advance to Bob Schor), but there is a sample code attached that sometimes mimics the problem.

Here is a summary of the problem, which typically occurs at higher sample rates (4 or more channels at 1 MSa/s).

 

I manually set the buffer size with the DAQmx Input.BufferSize property node, along with the DAQmx Logging.FileWriteSize and DAQmx Logging.SamplesPerFile properties. I always set the FileWriteSize to a value evenly divisible by the disk's file sector size. However, for some combinations of acquisition time, FileWriteSize, and SamplesPerFile, the logging just freezes without any errors. That is, no errors pop up, but the logging simply stops, and it does not stop at the same place every time. Does anyone have an example of how to set the optimum values for these properties? For the attached example the values appear to work for an NI 9223 module, but how did LabVIEW calculate the optimal FileWriteSize?
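For illustration, the alignment constraints described above can be sketched in Python. The sector size, sample width, and the rules themselves are assumptions drawn from this post, not NI-documented requirements:

```python
SECTOR = 512          # disk sector size in bytes (assumption: query the actual drive)
BYTES_PER_SAMPLE = 4  # assumed raw sample width for the 9223

def sector_aligned(nbytes, sector=SECTOR):
    """Round a byte count up to the next multiple of the sector size."""
    return -(-nbytes // sector) * sector

def check_combo(buffer_samples, samples_per_file, write_bytes, nchan):
    """List which of the alignment heuristics from the post a given
    combination violates (toy checker, not NI's actual rules)."""
    problems = []
    if write_bytes % SECTOR:
        problems.append("FileWriteSize is not sector-aligned")
    scans_per_write = write_bytes // (BYTES_PER_SAMPLE * nchan)
    if samples_per_file % scans_per_write:
        problems.append("SamplesPerFile is not a multiple of the write size")
    if buffer_samples % scans_per_write:
        problems.append("buffer is not a multiple of the write size")
    return problems

print(check_combo(4_096_000, 1_024_000, 2048, 4))   # [] -> combination passes
```

A combination like buffer = 4,096,000 samples, 1,024,000 samples per file, and a 2048 B write size satisfies all three heuristics; change samples-per-file to 1,000,000 and the multiple-of-write-size check fails.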

 

Also, I needed to adjust the DAQmx channel properties (UsbXferReqSize and UsbXferReqCount) to get high-speed streaming to work. How are the optimal values calculated here? I just guessed.
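For what it's worth, one back-of-the-envelope way to think about those two properties (a hypothetical heuristic, not NI's documented sizing rule): their product is roughly the amount of data the bus can keep in flight while the host is busy, so size it to cover the stall you want to survive.

```python
def usb_xfer_budget(nchan, rate_hz, bytes_per_sample, latency_s):
    """Bytes the in-flight USB transfer requests must cover to survive a
    host stall of latency_s seconds (illustrative heuristic, not NI guidance)."""
    return int(nchan * rate_hz * bytes_per_sample * latency_s)

budget = usb_xfer_budget(4, 1_000_000, 4, 0.1)  # 4 ch @ 1 MS/s, 100 ms stall
count = 8                                       # UsbXferReqCount (a guess, as in the post)
req_size = -(-budget // count)                  # ceil-divide into a UsbXferReqSize
print(budget, req_size)                         # 1600000 200000
```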

 

Lastly, do the properties need to be set in a particular order? That is, first the buffer, then SamplesPerFile, and so on?

 

Thanks for any advice or examples.

 

Cheers,

mcduff

 

Message 1 of 8
(3,154 Views)

I have no (serious) complaints about the code you posted -- thanks for posting code.  You are operating at a faster data rate, and "pushing the system" harder than I am, so I'm not sure I can make helpful suggestions.  I've also not saved to TDMS -- I prefer doing a DAQmx Read, putting the data into a Queue, and (in parallel) streaming to a binary file (but then I'm usually talking about 24 or so channels at 1 kHz, trivial compared to your data rates).
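The producer/consumer pattern described above, sketched in Python with the DAQmx Read replaced by fake data blocks (names and block sizes are illustrative only):

```python
import os, queue, struct, tempfile, threading

def producer(q, n_blocks, samples_per_block):
    # Stand-in for the DAQmx Read loop: push blocks of fake samples.
    for i in range(n_blocks):
        q.put([float(i)] * samples_per_block)
    q.put(None)  # sentinel: acquisition done

def consumer(q, path):
    # Parallel loop: stream each block to a flat binary file as float64.
    with open(path, "wb") as f:
        while (block := q.get()) is not None:
            f.write(struct.pack(f"{len(block)}d", *block))

q = queue.Queue()
path = os.path.join(tempfile.mkdtemp(), "stream_demo.bin")
writer = threading.Thread(target=consumer, args=(q, path))
writer.start()
producer(q, n_blocks=10, samples_per_block=1000)
writer.join()
print(os.path.getsize(path))  # 10 blocks * 1000 samples * 8 bytes = 80000
```

The queue decouples acquisition timing from disk latency, which is the whole point of the pattern: the read loop never blocks on the file system.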

 

Bob Schor

Message 2 of 8
(3,119 Views)

Thanks. The problem is that the help is lacking, and so are the examples that use these properties.

 

I like the logging because it is the most memory- and CPU-efficient means of saving the data. This lets users run data acquisition on mid- to low-end computers, so I would like to get it working consistently.

 

Cheers,

mcduff

 

EDIT: The attached code is something I downloaded and made a few additions to; I admit it's not my best work. The real code I am not allowed to share.

Message 3 of 8
(3,116 Views)

McDuff

You always seem to come up with interesting stuff to dig into.

 

In this case I'm assuming you are using a cDAQ-9171 USB chassis with the 9223.  USB (self-powered) is capable of those speeds IF you do some really cool stuff inside everything to make it smarter than a COTS solution (LINK)

 

Now that super-duper speed needs some things (oh, just for grins) like onboard FIFOs that can make it work.  Whoops, those 127-sample-per-channel FIFOs on the 9223 aren't going to sustain 4 MS/s throughput.  An X Series device should be able to make it go on the high end, with their 4 or 8 MS/ch FIFOs and the right tech in the host.
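The FIFO point above in numbers: at full per-channel rate, a 127-sample FIFO gives the bus only microseconds of slack before samples are lost (quick arithmetic sketch; the depth figure is from this post):

```python
fifo_depth = 127         # samples per channel in the 9223's onboard FIFO
rate_per_ch = 1_000_000  # samples/s per channel at full speed
headroom_us = fifo_depth / rate_per_ch * 1e6
print(headroom_us)       # 127.0 -> the bus must service the device every ~127 microseconds
```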

 

You are never going to make a silk purse out of a sow's ear so "This allows the user to use mid to low-end computers for data acquisition" is a pipe dream at best.

 


"Should be" isn't "Is" -Jay
Message 4 of 8
(3,098 Views)

Jeff, I was hoping you had used those functions before.

 

The thing is, I had it working before for 8 channels at 2 MSa/s on a 10-year-old laptop. (It was a USB X Series 6366, I think.)

 

But here is the catch, ...

To get it to work, I set the buffer size to 4 times the number of samples per file and set the file write size to the number of samples per file. Why is this a problem? Increasing the buffer size increases the memory the program uses. It works for 10-second files, but I want to give my users the option of 1000-second files, even if I don't agree with it, and there is not enough memory for a buffer four times the size of a 1000-second file. So I want a smaller buffer with smaller write sizes, so I can have my 1000+ second files.
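The memory problem above in numbers, assuming 4 channels at 1 MS/s and 4 bytes per sample (the channel count and sample width are assumptions for illustration):

```python
def buffer_gb(nchan, rate_hz, file_seconds, bytes_per_sample=4, multiple=4):
    """Host-buffer footprint when the buffer is sized at `multiple` times
    samples-per-file, as in the scheme described above."""
    samples_per_file = nchan * rate_hz * file_seconds
    return samples_per_file * bytes_per_sample * multiple / 1e9

print(buffer_gb(4, 1_000_000, 10))    # 0.64 GB -> workable
print(buffer_gb(4, 1_000_000, 1000))  # 64.0 GB -> hopeless on a laptop
```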

 

Actually, I found some of your old code (good stuff) for finding divisors of integers. For now, I will try combinations where all of my numbers are multiples of each other and of the disk sector size. I may even make the UsbXferReqSize a multiple of all those numbers. I know I am close, as it works for some combinations of sampling rates, acquisition times, etc., but not for others. (My test case is a 10-year-old laptop.)
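A minimal version of the divisor-finding approach mentioned above, with the sector-size filter layered on top (plain trial division; the filtering heuristic is from this thread, not NI guidance):

```python
def divisors(n):
    """All positive divisors of n, via trial division up to sqrt(n)."""
    small, large = [], []
    d = 1
    while d * d <= n:
        if n % d == 0:
            small.append(d)
            if d != n // d:
                large.append(n // d)
        d += 1
    return small + large[::-1]

# Candidate write sizes: divisors of the file size that are sector multiples.
candidates = [d for d in divisors(1_048_576) if d % 512 == 0]
print(candidates[0], candidates[-1])  # 512 1048576
```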

 

cheers,

mcduff

Message 5 of 8
(3,086 Views)

Yup, the X Series 6366 is good for 16 MS/s throughput.  The 6356 should get you the 4 channels at 1 MS/s just fine.  The 9232 just won't get you there.

 

And hey, do me a solid and post the link to that old example. I forgot to tag it and it got lost in 10k posts.


"Should be" isn't "Is" -Jay
Message 6 of 8
(3,076 Views)
Message 7 of 8
(3,065 Views)

Update:

 

Here is some more information that I found, in case anyone is interested.

 

http://forums.ni.com/t5/forums/v3_1/forumtopicpage/board-id/250/thread-id/52229/page/3

 

Surprisingly, the best file write size is rather small, 2048 B, according to that post. I was doing the exact opposite; I thought bigger file write sizes, spread further apart, would be better. Still need to test this, though. (So far, smaller write sizes are better; I still do not understand why.)
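Quick arithmetic on that 2048 B figure (the channel count, sample width, and rate below are assumptions matching the setup discussed earlier, not values from the linked post):

```python
write_bytes = 2048              # FileWriteSize from the linked post
sector = 512
nchan, bytes_per_sample = 4, 4  # assumed 9223-style setup
rate_hz = 1_000_000

scans_per_write = write_bytes // (nchan * bytes_per_sample)
writes_per_sec = nchan * rate_hz * bytes_per_sample / write_bytes
print(scans_per_write, writes_per_sec)  # 128 scans per write, 7812.5 writes/s
```

So a 2048 B write is still sector-aligned (4 sectors), but it implies thousands of small writes per second, which is why the result is counterintuitive.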

 

Attached is a presentation that I downloaded a while back (can't remember where) that also goes into this discussion. Not sure why it says NI Confidential on it; NI, if you are listening, you can remove it if you want.

 

Cheers,

mcduff

Message 8 of 8
(3,040 Views)