Why would a quadratic mean be a better representation of an average than an old-fashioned "add up the numbers and divide by how many there are"?
A bit of context: measuring a number of similar but not identical items that fall within a range produces a bell curve, but apparently working out the quadratic mean gives a truer average.
Have you not just described a Standard Deviation?
Regardless: "average" is an ambiguous term so I'd hazard it'd entirely depends on what you mean by "better." What are you trying to achieve?
ObDisclaimer: I am not a Mathematician, I flunked A Level Maths 30 years ago.
Have you not just described a Standard Deviation?
Almost. Quadratic mean is a now rarely used term for the RMS (root of the mean of the squares). Standard deviation is the RMS of the deviations from the mean.
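To make the distinction concrete, here's a minimal sketch (the data set is just an illustrative example, not from the thread) showing the arithmetic mean, the quadratic mean/RMS, and the standard deviation as the RMS of the deviations from the mean:

```python
import math

def mean(xs):
    # good old-fashioned arithmetic mean
    return sum(xs) / len(xs)

def rms(xs):
    # quadratic mean: root of the mean of the squares
    return math.sqrt(sum(x * x for x in xs) / len(xs))

def std_dev(xs):
    # (population) standard deviation: the RMS of the
    # deviations from the mean
    m = mean(xs)
    return rms([x - m for x in xs])

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(mean(data))     # 5.0
print(rms(data))      # ~5.39 — never below the arithmetic mean
print(std_dev(data))  # 2.0
```

Note the RMS comes out above the arithmetic mean, which ties in with the point further down the thread.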
If you’ve got a bell curve, also known as a normal or a Gaussian distribution, then the arithmetic (good old fashioned) mean and the standard deviation are sufficient to describe it.
There are tests you can apply to check it really is a normal distribution.
Quadratic mean/rms can be a better measure if you’re dealing with a mix of positive and negative values.
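A quick sketch of why: with a mix of positive and negative values the arithmetic mean can cancel to zero, while the RMS still reflects the typical magnitude (the numbers here are illustrative):

```python
import math

signal = [3.0, -3.0, 3.0, -3.0]

# positives and negatives cancel in the arithmetic mean
m = sum(signal) / len(signal)

# squaring first makes every term positive, so the RMS
# captures the typical size of the values
r = math.sqrt(sum(x * x for x in signal) / len(signal))

print(m)  # 0.0
print(r)  # 3.0
```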
Quadratic mean will always* be higher than the simple mean - that's got to be better, right?
* unless all yr numbers are the same
RMS is used for measuring AC voltage.
Imagine your AC voltage coming into your house.
The average (mean) is zero. But trust me, despite that, it really hurts if you touch it.
The peak voltage is about 320V and the RMS is 230V.
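That example can be checked numerically: sample one full cycle of a sine wave with a 230 V RMS amplitude (peak = 230 × √2 ≈ 325 V) and compute both averages. The sample count is an arbitrary choice for the sketch:

```python
import math

peak = 230 * math.sqrt(2)   # ≈ 325 V peak for 230 V RMS mains
n = 100_000                 # samples over one full cycle

samples = [peak * math.sin(2 * math.pi * i / n) for i in range(n)]

# arithmetic mean: the positive and negative half-cycles cancel
mean_v = sum(samples) / n

# RMS: the value that "really hurts" — peak / sqrt(2)
rms_v = math.sqrt(sum(v * v for v in samples) / n)

print(round(mean_v, 6))  # ≈ 0
print(round(rms_v, 1))   # 230.0
```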