07-09-2012 11:20 AM
I'm new to LabVIEW, but I am trying to make a VI that reads in a bunch of analog values, averages them once every second, and then records this value in a spreadsheet.
Basically, I want a current and voltage reading every second.
Right now I cannot seem to wrap my head around how to determine the sampling interval of my VI; it just seems like it's all over the place. Can somebody take a look and maybe help me understand the timing of my VI?
07-09-2012 01:31 PM
You have configured your hardware to do continuous acquisition at 1 kHz, according to the default settings on the front panel. You may change these before you run your code. Then, in the while loop, you wait 10 ms or for 100 samples (the "Samples to Read" input) on each channel. I'm not sure why you split the channels and then merge them again.

Finally, you are writing to a file in a strange format: you have a 2D array, and you're putting a timestamp before only the first row of that array. So if you read that data into Excel and expect the first column to be the timestamp, you'll get weird results. Is that the problem? Otherwise I don't see how there's a question about timing, since you are using the hardware clock.
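Since LabVIEW is graphical I can't paste the VI here, but here is the once-per-second logic sketched in Python with NI's nidaqmx package, just to illustrate the timing. This is a minimal sketch, not your VI: the device/channel names "Dev1/ai0" and "Dev1/ai1", the voltage/current mapping, and the "log.csv" file name are my assumptions. The idea is to set "Samples to Read" to 1000 at a 1 kHz rate so each read returns exactly one second of data, then average that chunk and write one timestamped row.

    import csv
    import time

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    SAMPLE_RATE = 1000       # Hz, hardware-clocked, as in your VI
    SAMPLES_PER_READ = 1000  # one second's worth of data per loop iteration

    with nidaqmx.Task() as task, open("log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "voltage_avg", "current_avg"])

        # Two channels: ai0 = voltage, ai1 = current (assumed mapping).
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        task.ai_channels.add_ai_voltage_chan("Dev1/ai1")
        task.timing.cfg_samp_clk_timing(
            SAMPLE_RATE,
            sample_mode=AcquisitionType.CONTINUOUS,
            samps_per_chan=SAMPLES_PER_READ,
        )

        for _ in range(60):  # log for one minute, for example
            # read() blocks until 1000 samples per channel are available,
            # so the hardware clock paces the loop at one iteration per second.
            data = task.read(number_of_samples_per_channel=SAMPLES_PER_READ)
            voltage_avg = sum(data[0]) / len(data[0])
            current_avg = sum(data[1]) / len(data[1])
            # One timestamp per row, so every line in the file is self-describing.
            writer.writerow([time.time(), voltage_avg, current_avg])

Because the read blocks until a full second of samples has arrived, no software wait is needed in the loop, and writing a timestamp on every row (rather than before only the first row of a 2D array) means Excel will parse the file cleanly.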
07-10-2012 12:23 PM