LabVIEW


Sort data from a stream of multiple sensors: one sensor sends data at regular intervals, another sends data blocks on an irregular time base

Hello, I hope you can help me; I don't know where to start looking. I tried to simplify the example a bit.

The sensors have this format as an output:

TimeStamp (s); MeasuredValue

For this question I added the time LabVIEW receives the data. This time is not in the stream, so the format in my question becomes:

ReceiveTime (s); TimeStamp (s); MeasuredValue

 

 

Sensor A will give:

1.0; 1.0; measured value

2.0; 2.0; measured value

3.0; 3.0; measured value

4.0; 4.0; measured value

5.0; 5.0; measured value

 

 

Sensor B will give multiple values in a block:

2.2; 1.1; measured value

2.2; 1.2; measured value

2.2; 1.3; measured value

2.2; 1.4; measured value

2.2; 1.5; measured value

 

I need an output which looks like this:

Timestamp; SensorA; SensorB

 

So a part of the data combined becomes:

 

 

1.0; measured value A; NaN

1.1; NaN; measured value B

1.2; NaN; measured value B

 

In reality there are more than 2 sensors and it is more complex, but I try to keep my question simple :-). There will be a lot of data, I think about 200 samples per second. That doesn't sound like a lot, but the measurement might run for a long time. The data eventually needs to be saved to a file. I know that if I want to sort streams of data I need the correct functions, or the system might run into resource problems; that is one of the reasons I am asking this question.
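Since LabVIEW is graphical, here is a language-neutral sketch of the combining step in Python: a lazy merge of two already-sorted (timestamp, value) streams into rows of Timestamp; SensorA; SensorB, with NaN filling the channel that has no sample at that time. The function and stream names are made up for illustration.

```python
import heapq
import math

def merge_streams(stream_a, stream_b):
    """Merge two (timestamp, value) streams, each already sorted by
    timestamp, into rows (timestamp, sensor_a, sensor_b), with NaN
    in the column that has no sample at that timestamp."""
    tagged = heapq.merge(
        ((t, 0, v) for t, v in stream_a),   # 0 = column for sensor A
        ((t, 1, v) for t, v in stream_b),   # 1 = column for sensor B
    )
    rows = []
    for t, col, v in tagged:
        row = [t, math.nan, math.nan]
        row[1 + col] = v
        rows.append(tuple(row))
    return rows

rows = merge_streams([(1.0, 10.0), (2.0, 20.0)],
                     [(1.1, 1.0), (1.2, 2.0)])
for r in rows:
    print("%.1f; %s; %s" % r)
```

Because `heapq.merge` consumes its inputs lazily, the same pattern keeps memory bounded even for long recordings, as long as the merged rows are written out as they are produced instead of collected in a list.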

 

I hope one of you can help me! Thanx ! 

Message 1 of 10

Can you describe your two data streams in more detail?  Is one a continuous sampling of analog (or digital) data, at a fixed rate, without pauses, while the other occurs "irregularly" in time, at a much slower rate?  Is there a compelling reason that both need to be in the same file?

 

The "regular" stream suggests you are "sampling" data, perhaps with DAQmx, which allows you to have the hardware give you arrays of data, all of fixed sizes, all with a built-in Time Stamp, as the data format that NI calls a "Waveform".  You can easily set up a method to continuously stream these waveforms  to disk.  The other, slower, "irregular" data stream can be saved as "pairs" of entries, a TimeStamp (saying "when") and the data, itself (saying "What").  

 

LabVIEW has other formats for saving data, several of which I've not used (because my data tends to be "regular" and "synchronous"), but I'm sure if you look at "Data Formats" in LabVIEW Help or wait for someone else to reply, suggesting other Formats for you to investigate, you may come up with a more appropriate data format.

 

Your description of what you want to do was a little vague.  To get more useful help, you'll have to put more effort into showing us what you are trying to do.  Please do not show us "pictures" of fragments of LabVIEW code.  Also note that if you are using LabVIEW 2023, many of us (I, among them) will be unable to open any code that you save in that version.  We do encourage you to post your code, but if using the most recent version of LabVIEW, please use the File Menu and "Save for Previous Version", going back 2 or 4 years.

 

Bob Schor

Message 2 of 10

Had a maybe similar task long ago...

multiple instruments, each with its own (different) timebase, collected via multiple RS232 links ...

 

Now the operator wants to monitor the data: easy, collect all data streams with timestamps in separate arrays, cut out the time window of interest (the last hour) and use an XY diagram.

 

I stored the data in individual files.

 

But some guys wanted ONE 'clean' 'regular' chart with one timestamp and all values at that time.

Ah... what timestamp? 

 

use <systemtime> and the last actual value according to timestamp?

 

use <systemtime> and the nearest value according to timestamp?

 

use <systemtime> and interpolate (linear?) from the values ?

 

You (or your boss) have to make a decision.
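The three options above can be sketched in Python (LabVIEW being graphical, this is only a language-neutral illustration; the function names `last_value`, `nearest_value` and `linear_interp` are made up). Each answers "what was this channel at time t?" under one policy, assuming the channel's samples are sorted by timestamp.

```python
import bisect

def last_value(times, values, t):
    """Most recent sample at or before t (sample-and-hold);
    clamped to the first sample if t precedes all data."""
    i = bisect.bisect_right(times, t) - 1
    return values[max(i, 0)]

def nearest_value(times, values, t):
    """Sample whose timestamp is closest to t."""
    i = bisect.bisect_left(times, t)
    if i == 0:
        return values[0]
    if i == len(times):
        return values[-1]
    return values[i] if times[i] - t < t - times[i - 1] else values[i - 1]

def linear_interp(times, values, t):
    """Linear interpolation between the two samples bracketing t;
    clamped at the ends of the record."""
    i = bisect.bisect_left(times, t)
    if i == 0:
        return values[0]
    if i == len(times):
        return values[-1]
    t0, t1 = times[i - 1], times[i]
    frac = (t - t0) / (t1 - t0)
    return values[i - 1] + frac * (values[i] - values[i - 1])

times = [1.0, 2.0, 3.0]
vals = [10.0, 20.0, 30.0]
print(last_value(times, vals, 2.6))     # sample-and-hold
print(nearest_value(times, vals, 2.6))  # nearest neighbour
print(linear_interp(times, vals, 2.6))  # linear interpolation
```

For the same query time the three policies can give three different answers, which is exactly why it is a decision someone has to make.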

 

<systemtime> could be any other timesource ... Unless it's all GPS-disciplined (or based on one master clock), you're in hell anyway 😉 (check the clocks over your multi-day timeframe; a second is lost quickly).

No matter what you choose, keep the raw data (by channel including the timestamps) !

 

Greetings from Germany
Henrik

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'


Message 3 of 10

Hello Bob,

 

The data is coming in from a measurement device which uses some ADCs and a microcontroller. The data stream bandwidth is limited for some reason, so every now and then they just skip a message in order to send another message which is not of interest to me. Sometimes there is an additional delay in the measurements, shifting all the measurement timestamps. I cannot use another device. The measurement data is received in an array, and the length of the array can also vary.

 

Then there is another measurement device just measuring in a steady pace.  

 

Yes these measurements need to go into the same file.

 

I did take a look at some different datatypes:

 

waveform

is not suitable, as it needs to have a steady dt value (dt being the time between measurements).

 

 

Dynamic data

I think, but I am not sure, that it also has a steady dt. There are a lot of functions available, though; maybe I can use Align and Resample to get things done. But I am not sure. If someone can tell, please do so!

 

Thanks for the help; I'll give it another half hour or so before my day is done.

Message 4 of 10

Hello Henrik,

 

what you describe gets close to my situation, with one exception: there is already a timestamp included. I did try to clarify it a little better in the post above.

 

Maybe I can interpolate, but I do not know if that works with timestamps which don't use a steady dt.

Message 5 of 10

Ask yourself WHY you need all channels with one timestamp, even if they are already timestamped.

For a nice diagram? Some simple-minded people want to create nice Excel sheets and don't know XY diagrams?

 

If your not-hard-synced sources send a value (about) every second, and you want to write one dataset of all channels every second over a longer period, some sources will be a bit slower and some a bit faster, and some will change speed a little while the test is running, surely ending in the situation where a channel only has old data, or where you have two values in between your global second. Communicate that problem.

 

You can collect some seconds of data and resample all channels to one timestamp; depending on the sources you can maybe choose a better-than-linear interpolation (spline; with some more math even a DFT on 'randomly' sampled data is possible ...).

 

HDF might be a better solution to stream asynchronous timestamped data to disk; it even has powerful lossless compression built in. However, some 1000 values per second isn't a big deal nowadays. (Just from experience I can tell you that there will be a situation where you (or someone else) want all that data with the individual timestamps 😉)
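HDF5 itself would need a library such as h5py; as a stdlib-only illustration of the same append-as-you-go idea, here is a Python sketch that streams fixed-size (channel, timestamp, value) records to a binary buffer, so channels with different timebases can interleave freely while keeping every individual timestamp. All names are made up.

```python
import io
import struct

# One record: channel id (uint16), timestamp (float64), value (float64),
# little-endian, no padding -> 18 bytes per record.
RECORD = struct.Struct("<Hdd")

def append_records(fh, channel, samples):
    """Append (timestamp, value) pairs for one channel to an open
    binary file; records from different channels may interleave."""
    for t, v in samples:
        fh.write(RECORD.pack(channel, t, v))

def read_records(fh):
    """Yield (channel, timestamp, value) tuples back out."""
    fh.seek(0)
    data = fh.read()
    for off in range(0, len(data), RECORD.size):
        yield RECORD.unpack_from(data, off)

buf = io.BytesIO()                                  # stands in for a file on disk
append_records(buf, 0, [(1.0, 10.0), (2.0, 20.0)])  # "regular" sensor A
append_records(buf, 1, [(1.1, 1.5)])                # one value of a sensor B block
print(list(read_records(buf)))
```

A real HDF5 file adds chunked, compressed, resizable datasets on top of this, but the core pattern is the same: append records as they arrive, keep the per-sample timestamps, and sort or resample later.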

 

How I would try it: create a (lossy?) queue for every channel with some (10?) elements; the producers (your data collectors) fill it (value and up to two timestamps). The queue references and an enum with the interpolation method for that channel (and maybe additional channel info) fill another array of clusters.

 

A consumer loop now runs through that queue array for every 'global' timestamp, and does whatever you want to do with the data.

 

Some extra bells in the producer and the consumer loop can/should check for data or communication loss.
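A minimal Python sketch of the queue-per-channel idea, with Python's `queue` standing in for LabVIEW queue references; all names, the per-channel interpolation tag, and the drop-oldest lossy policy are illustrative assumptions:

```python
import queue

# One queue per channel, plus per-channel post-processing info
# (here just an interpolation-method tag), as described above.
channels = {
    "A": {"q": queue.Queue(maxsize=10), "interp": "linear"},
    "B": {"q": queue.Queue(maxsize=10), "interp": "hold"},
}

def produce(name, samples):
    """Producer side: push (timestamp, value) pairs, dropping the
    oldest element when the queue is full (a lossy queue)."""
    q = channels[name]["q"]
    for sample in samples:
        if q.full():
            q.get_nowait()          # discard oldest
        q.put_nowait(sample)

def consume_latest():
    """One pass of the consumer loop: drain every channel queue and
    keep the most recent sample per channel for this 'global' tick."""
    snapshot = {}
    for name, ch in channels.items():
        while not ch["q"].empty():
            snapshot[name] = ch["q"].get_nowait()
    return snapshot

produce("A", [(1.0, 10.0), (2.0, 20.0)])
produce("B", [(1.1, 1.5)])
latest = consume_latest()
print(latest)
```

In a real application the producers and the consumer would run in separate loops (threads), and the consumer would apply each channel's tagged interpolation method instead of just taking the newest sample.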

 

Greetings from Germany
Henrik



Message 6 of 10

Hello Henrik, thank you for your answer. It might be that I am tired at the end of the week, but I think I don't get it. Could you please give some example code? Meanwhile I will give it another read.

Message 7 of 10

Hello Bob, I did give sorting the data a try. I made a small program to simulate the data I am receiving. There are 2 sensors: I try to simulate a "normal" one with a stable dt, and a sensor which gives me blocks of data. As you can see, I can simulate a situation where the data is added in an incorrectly synchronized manner.

 

[Screenshot attachment: Its_Me_0-1689454049568.png]

 

My (simplified) sorting function works fine. I wonder, though, if this is the normal way this is done. LabVIEW is loaded with handy standard functions, and I wonder if there is not a better, easier (?) way, one where I could use more standard functions to process the measurement data. There are standard functions for waveforms, there is dynamic data, etc. These functions might interact more easily than my method; by "interact" I mean save the data to files, interpolate, etc.

 

The data in the blocks arrives at about 20 Hz, and a block can contain about 25 measurement values. The normal signal can be between 1 and 10 Hz and contains 1 sample at a time. When the normal sensor is set to a measurement frequency, this frequency is kept the same for the whole measurement.

 

I receive the blocks of data via a DLL. I can check the values and they are okay. The data from the DLL comes in an array format, as in the sort test program I made. I cannot share the DLL, but the test program I included creates data in the same format.

 

I hope you can help me out. 

 

Thanx !

Message 8 of 10

Please correct me if I am wrong, but what you are saying is that you would do something as in the test program I just added, only you would send the measurement data using a Queue with extra info on how to post-process the data, e.g. what kind of interpolation is needed?

Message 9 of 10

Thanks for attaching "code I could read".  I have a number of suggestions that you are free to reject (because you "need to do it another way"), but are based on my own fairly-long history with collecting and analyzing "real" data.

 

  • LabVIEW TimeStamps are a blessing and a curse.  Sometimes you need the actual "time of an event", and that's the Blessing.  Most times, however, you only need what I'll call "relative time", as in "how many seconds after I started the program did these data arrive" (so "time" effectively starts from 0).
  • Hardware "clocks" are a definite blessing.  They tend to be far more useful for data collection than the PC's clock (which doesn't know when the data were acquired, just where the CPU was when it handled the data).
  • The LabVIEW "Waveform" construct is designed for "hardware sampling", i.e. with a hardware (precise) clock, and lets you "have your cake and eat it, too".  If you want absolute time (seconds since 1 Jan 1904 UTC), you can use t0.  If you want relative time (seconds since the start of data collection), you can derive it from dt.
  • Data acquired using the same clock can be saved as an "array of channels".  Data acquired using different clocks cannot easily be saved together, as potentially every point needs its own time component.  You have to make a choice about how to handle "chaotic" data -- consider doing some form of "resampling", i.e. converting it "on the fly" as though it had been sampled at the same times as your "regular" data.
  • With all this talk about "time", consider using Relative Time on Charts and Graphs -- is the date and "clock time" really what you want to show on the X axis, or is it the hours (or minutes, or milliseconds) since, say, "Start of Data Collection"?

What I do is to start a millisecond timer going when I start Data Collection.  If I use an I32, I can record for about 7 weeks before the Timer "rolls over" (and my recording sessions usually last hours).  I have a separate file that I call an "Events" file for all the asynchronous data (typically saved as a cluster that includes the Timer, the nature of the Event, and any Event-specific information).  All the other Data Files that are sampled with DAQmx go in their own "File of Waveforms" files.
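A rough Python equivalent of that scheme: a relative millisecond timer started at the beginning of data collection, plus one entry of a hypothetical "Events" file (timer, nature of the event, event-specific information). The field and function names are made up for illustration.

```python
import json
import time

t0 = time.monotonic()            # "start of data collection"

def rel_ms():
    """Milliseconds since the start of data collection
    (relative time, so 'time' effectively starts at 0)."""
    return int((time.monotonic() - t0) * 1000)

def event_record(kind, **info):
    """One entry of the asynchronous 'Events' file: relative timer,
    the nature of the event, and any event-specific information."""
    return {"t_ms": rel_ms(), "event": kind, "info": info}

rec = event_record("block_received", sensor="B", n_samples=25)
print(json.dumps(rec))
```

The regularly sampled channels would go to their own files, as in the "File of Waveforms" approach above, while every irregular arrival becomes one such event record.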

 

Bob Schor

Message 10 of 10