07-31-2015 01:27 PM
I need to read 15 sensors 10,000 times per second and log all the data into a tab-delimited file. The code is running on a cRIO (NI 9082). Data is being read at the FPGA level off three NI 9220 analog voltage modules and logged at the RT level. I'm writing a GPS timestamp plus 15 × 24-bit FXP values. The tests will run as long as 30 minutes, so it's a boatload of data, something like 2.5 GB in a full test.
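For context, a back-of-envelope estimate of the data volume (a rough sketch, assuming each 24-bit FXP sample occupies 4 bytes in binary form and roughly 10 characters per value in tab-delimited text; those storage sizes are assumptions, not from the post):

```python
# Rough data-volume estimate for a 30-minute, 15-channel, 10 kS/s test.
# Assumptions: 4 bytes per sample in binary, ~10 ASCII chars per value.

channels = 15
sample_rate = 10_000          # samples per second per channel
duration_s = 30 * 60          # 30-minute test

samples_total = channels * sample_rate * duration_s  # 270 million samples

binary_bytes = samples_total * 4    # raw binary
ascii_bytes = samples_total * 10    # tab-delimited text

print(f"binary ≈ {binary_bytes / 1e9:.2f} GB, ASCII ≈ {ascii_bytes / 1e9:.2f} GB")
```

That works out to roughly 1.1 GB in binary versus ~2.7 GB as text, which is consistent with the 2.5 GB figure above.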
I've spent a lot of time trying to figure this out on my own, and all I have now is a system that bogs down the cRIO, messing up other processes, and loses about 60% of the data points. I'm using a FIFO to get the data from the FPGA to the RT, but the RT logger can't write all the data to disk even with buffers designed to keep me from losing data. Before continuing my half-cocked efforts, I thought I'd ask advice from someone more experienced.
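The structure described here is the classic producer/consumer pattern: the acquisition side never blocks on disk, and the logger drains the queue and writes in large chunks. A minimal sketch in Python (the actual system is LabVIEW FPGA/RT, so this only illustrates the shape of the pattern; the chunk size and file name are arbitrary choices):

```python
# Illustrative producer/consumer sketch of a FIFO -> logger structure.
# The real system is LabVIEW FPGA/RT; this only shows the pattern:
# acquisition never waits on disk, and writes happen in big binary blocks.
import queue
import struct
import threading

CHUNK = 1000  # samples buffered before each disk write (tuning assumption)

def producer(q, n_samples):
    """Stand-in for the FPGA FIFO reader: pushes raw samples onto a queue."""
    for i in range(n_samples):
        q.put(i)
    q.put(None)  # sentinel: acquisition finished

def consumer(q, path):
    """Logger loop: drains the queue and writes large binary blocks."""
    buf = []
    with open(path, "wb") as f:
        while True:
            item = q.get()
            if item is None:
                break
            buf.append(item)
            if len(buf) >= CHUNK:
                f.write(struct.pack(f"<{len(buf)}i", *buf))
                buf.clear()
        if buf:  # flush the partial tail
            f.write(struct.pack(f"<{len(buf)}i", *buf))

q = queue.Queue()
t = threading.Thread(target=producer, args=(q, 2500))
t.start()
consumer(q, "log.bin")
t.join()
```

The key point is that a slow write only grows the queue temporarily instead of stalling acquisition; data is lost only if the queue's backing memory is exhausted.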
If you were designing a system like this from the ground up, how would you do it? Is it even possible to log this much data this fast?
07-31-2015 01:33 PM
How many bits per data point?
07-31-2015 01:36 PM
Never mind, I see... I was thinking DMA to your logging computer would probably be able to handle it.
07-31-2015 01:42 PM
Also, consider that you probably don't have to process your data until after the test is complete. Converting a boatload of data into a CSV can take a lot of energy from a CPU.
07-31-2015 02:21 PM
For that matter, have you considered using the TDMS format? Seems like a perfect use case. Representing that much data in ASCII adds a ton of overhead, so you should avoid it if you can.
07-31-2015 02:29 PM
I would save the data to a TDMS file. It is A LOT faster to write data that way than to convert it to an ASCII format and then save it.
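To see why a binary format like TDMS wins, compare the on-disk size of the same values stored as tab-delimited text versus packed binary (an illustrative Python sketch; TDMS itself is a structured binary format with its own headers, and the sample values here are made up):

```python
# Rough illustration of ASCII overhead: the same 1,000 samples stored as
# tab-delimited text vs. packed 32-bit floats. Formatting to text also
# costs CPU time, not just disk space.
import struct

samples = [i * 0.001 for i in range(1000)]  # fake sensor values

ascii_form = "\t".join(f"{v:.6f}" for v in samples).encode()
binary_form = struct.pack(f"<{len(samples)}f", *samples)

print(len(ascii_form), len(binary_form))  # ASCII is roughly 2x larger here
```

With wider values (full 24-bit resolution, timestamps) the text representation inflates even further.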
07-31-2015 02:41 PM
Thanks, TDMS sounds right. I figured there was something built in. I see there is a library to read TDMS files in MATLAB, which is what the client is using.
Thanks for your help, everyone.