05-12-2016 12:35 PM
As I explained in my previous post on "Decimate" and "Average", the main difference between these two ways of "representing" a block of data (by a single element, as in "Decimate", or by a statistic, such as the Mean) has to do with the statistical distribution of the resulting measure.
Let's say your data are noisy (whose data aren't?). Assume you can model them as a Gaussian Distribution with mean m and standard deviation s. If you pick a single element to represent N data points (say N = 50), that element will have mean m and standard deviation s. However, if you choose the mean of those N points, the mean will still have mean m, but its standard deviation will be s/sqrt(N).
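That claim is easy to check numerically. The thread is about LabVIEW's Decimate and Mean functions, so take this as a minimal sketch in Python/NumPy rather than the actual implementation; the block size, number of blocks, and random seed are just illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    m, s, N = 10.0, 2.0, 50            # Gaussian mean, std dev, points per block
    blocks = rng.normal(m, s, size=(10_000, N))

    decimated = blocks[:, 0]           # "Decimate": keep one element per block
    averaged = blocks.mean(axis=1)     # "Mean": average each block of N points

    print(decimated.std())             # ~ s          (about 2.0)
    print(averaged.std())              # ~ s/sqrt(N)  (about 2.0/7.07, i.e. 0.28)

The decimated representation keeps the full noise level, while the averaged one shrinks it by sqrt(50), which is exactly the difference the paragraph above describes.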
I once made a display where I used the Mean, and my students (who were using the software) were unhappy: they were using the "noise level" to tell whether the electrodes were working properly (noisy = bad), and I'd "cleaned up" the signal too much. The fix was simple: use Decimate instead (I picked the first element of the Array).
The Mean is a good measure of what statisticians call "Central Tendency". If you have a very skewed distribution (e.g. most values are between 1 and 5, but 5% might be >100, pulling the Mean well outside the 1-to-5 range), the Median might be a better choice. You need to think about the distribution of your data values and what is important to show.
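Same caveat as above, but here is a quick NumPy sketch of that skewed case (the 95%/5% split and the value ranges come from the example; the sample sizes and seed are made up for illustration):

    import numpy as np

    rng = np.random.default_rng(1)
    typical = rng.uniform(1, 5, size=950)       # 95% of values between 1 and 5
    outliers = rng.uniform(100, 200, size=50)   # 5% of values > 100
    data = np.concatenate([typical, outliers])

    print(np.mean(data))      # ~ 10 -- dragged well outside the 1-to-5 range
    print(np.median(data))    # ~ 3  -- still reflects the bulk of the data

Here the Median reports where "most" of the data live, while the Mean is dominated by the small fraction of large values.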
Bob Schor