12-27-2021 07:36 AM
There is a file with data from a sensor: 40 channels, one recording every second.
The data needs to be averaged over intervals of 6 or 10 seconds.
There are 2 questions:
1st: How do I average the data channels?
2nd: The time channel does not need to be averaged; I just need to take every 6th or 10th second.
Thanks.
01-03-2022 04:54 AM
Hi Spiky,
For this you can use the “Reducing Classification” analysis.
Select how to reduce (mean, for example)
and define the width of the reducing interval.
Greetings
Walter
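Outside DIAdem, the reduction Walter describes can be sketched in Python/NumPy. The channel count, length, and interval width below are illustrative, not taken from the original file; the key point is that the data channels are averaged per block while the time channel is only decimated:

```python
import numpy as np

# Hypothetical example data: 40 channels sampled once per second.
n_seconds = 120
data = np.random.rand(n_seconds, 40)      # one row per second
time = np.arange(n_seconds, dtype=float)  # time channel, in seconds

width = 6  # averaging interval: 6 (or 10) seconds
n_blocks = n_seconds // width

# Average each data channel over non-overlapping blocks of `width` samples
averaged = data[:n_blocks * width].reshape(n_blocks, width, 40).mean(axis=1)

# Do NOT average the time channel: just take every `width`-th value
time_reduced = time[::width][:n_blocks]

print(averaged.shape)    # (20, 40)
print(time_reduced[:3])  # [ 0.  6. 12.]
```

Switching the interval from 6 s to 10 s is only a matter of changing `width`; DIAdem's “Reducing Classification” dialog does the equivalent internally.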
03-24-2022 05:38 AM
In my file, time is in the format mm/dd/yyyy hh:mm:ss.ms. The time step is 10 seconds, as can be seen in the picture.
But when I run DIAdem's averaging, it presents me with the start and end times (I think they were determined correctly), as in the picture.
How do I calculate the step? I need 60 seconds. How do I convert seconds into the format in which time is indicated in DIAdem?
Or how do I set another channel as the time axis? I have a serial-number scale.
03-24-2022 06:52 AM
In my file, the time is given in the format mm/dd/yyyy hh:mm:ss.ms. The step is 10 seconds.
DIAdem determined the start and end times itself (I hope correctly).
The question is: how do I set the step correctly? What is the formula to convert 60 s or 100 s?
03-24-2022 07:15 AM
Hi Spiky,
The normal time channel is internally a double value in steps of 1 second, starting at year 0. To translate a date/time value into this numeric representation you can use the function TTR (you can find more info in the DIAdem help). This function can also be used in the dialog. The step width of 10 s is TTR(“00.00.0000 00:00:10”) or just 10.
Greetings
Walter
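The point above can be illustrated outside DIAdem as well: because the time channel is just seconds counted from year 0, a step *width* is an ordinary difference in seconds. The sketch below is an assumption-laden illustration in Python (the function name `seconds_since_year_zero` and the exact epoch arithmetic are hypothetical; in DIAdem itself, use TTR):

```python
from datetime import datetime

def seconds_since_year_zero(dt):
    # toordinal() counts days from 0001-01-01; add 366 days for the
    # proleptic leap year 0 -- illustrative, not DIAdem's exact code.
    days = (dt.toordinal() - 1) + 366
    return days * 86400 + dt.hour * 3600 + dt.minute * 60 + dt.second

t0 = datetime(2022, 3, 24, 12, 0, 0)
t1 = datetime(2022, 3, 24, 12, 1, 0)

# For a step width, only the difference matters, so 60 s is simply 60:
step = seconds_since_year_zero(t1) - seconds_since_year_zero(t0)
print(step)  # 60
```

So to get a 60-second step in the averaging dialog, entering 60 is enough; no date/time formatting is needed.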