Mean


Editor-In-Chief: C. Michael Gibson, M.S., M.D.


In statistics, mean has two related meanings: the expected value of a random variable, and the arithmetic mean of a set of observations.

It is sometimes stated that 'mean' simply means 'average'. This is incorrect, as there are several types of average: the mean, the median, and the mode. For instance, reported average house prices almost always use the median value.

For a real-valued random variable X, the mean is the expectation of X. Note that not every probability distribution has a defined mean (or variance); see the Cauchy distribution for an example.

For a data set, the mean is the sum of the observations divided by the number of observations. The mean is often quoted along with the standard deviation: the mean describes the central location of the data, and the standard deviation describes the spread.

An alternative measure of dispersion is the mean deviation, equivalent to the average absolute deviation from the mean. It is less sensitive to outliers, but less mathematically tractable.

Beyond statistics, means are often used in geometry and analysis; a wide range of means has been developed for these purposes, most of which see little use in statistics. These are listed below.

Examples of means

Arithmetic mean

The arithmetic mean is the "standard" average, often simply called the "mean".

The mean is often confused with the median or the mode. The mean is the arithmetic average of a set of values, or distribution; however, for skewed distributions the mean is not necessarily the same as the middle value (the median) or the most likely value (the mode). For example, mean income is skewed upwards by a small number of people with very large incomes, so that the majority have an income lower than the mean. By contrast, the median income is the level at which half the population is below and half is above, while the modal income is the most common income and favors the larger number of people with lower incomes. The median or mode is often a more intuitive measure of such data.

That said, many skewed distributions are best described by their mean, such as the exponential and Poisson distributions.

For example, the arithmetic mean of 34, 27, 45, 55, 22, 34 (six values) is (34+27+45+55+22+34)/6 = 217/6 ≈ 36.167.
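
For illustration, the same calculation can be reproduced with Python's standard statistics module, which also reports the median and mode discussed above (a minimal sketch; the six values are those from the example):

<syntaxhighlight lang="python">
import statistics

values = [34, 27, 45, 55, 22, 34]

print(statistics.mean(values))    # 217/6 ≈ 36.167
print(statistics.median(values))  # 34.0
print(statistics.mode(values))    # 34
</syntaxhighlight>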

Geometric mean

The geometric mean is an average that is useful for sets of numbers that are interpreted according to their product rather than their sum (as is the case with the arithmetic mean), for example rates of growth.

<math> \bar{x} = \left ( \prod_{i=1}^n{x_i} \right ) ^{1/n}</math>

For example, the geometric mean of 34, 27, 45, 55, 22, 34 (six values) is (34×27×45×55×22×34)^(1/6) = 1,699,493,400^(1/6) ≈ 34.545.
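
A rough Python sketch of the same calculation (math.prod and statistics.geometric_mean are available in the standard library from Python 3.8 onward):

<syntaxhighlight lang="python">
import math
import statistics

values = [34, 27, 45, 55, 22, 34]

# n-th root of the product of the values
print(math.prod(values) ** (1 / len(values)))   # ≈ 34.545

# the standard library provides the same calculation directly
print(statistics.geometric_mean(values))        # ≈ 34.545
</syntaxhighlight>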

Harmonic mean

The harmonic mean is an average which is useful for sets of numbers which are defined in relation to some unit, for example speed (distance per unit of time).

<math> \bar{x} = n \cdot \left ( \sum_{i=1}^n \frac{1}{x_i} \right ) ^{-1}</math>

For example, the harmonic mean of the numbers 34, 27, 45, 55, 22, and 34 is

<math>\frac{6}{\frac{1}{34}+\frac{1}{27}+\frac{1}{45} + \frac{1}{55} + \frac{1}{22}+\frac{1}{34}}\approx 33.0179836.</math>
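
A minimal Python illustration of the same calculation (statistics.harmonic_mean is part of the standard library):

<syntaxhighlight lang="python">
import statistics

values = [34, 27, 45, 55, 22, 34]

# n divided by the sum of the reciprocals
print(len(values) / sum(1 / x for x in values))  # ≈ 33.018
print(statistics.harmonic_mean(values))          # same result
</syntaxhighlight>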

Generalized means

Power mean

The generalized mean, also known as the power mean or Hölder mean, is an abstraction of the quadratic, arithmetic, geometric and harmonic means. It is defined by

<math> \bar{x}(m) = \left ( \frac{1}{n}\cdot\sum_{i=1}^n{x_i^m} \right ) ^{1/m} </math>

By choosing the appropriate value for the parameter m we get

  • <math>m \to \infty</math>: maximum
  • <math>m = 2</math>: quadratic mean
  • <math>m = 1</math>: arithmetic mean
  • <math>m \to 0</math>: geometric mean
  • <math>m = -1</math>: harmonic mean
  • <math>m \to -\infty</math>: minimum
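
A short Python sketch of the power mean (the helper name power_mean is chosen here for illustration and is not a standard library function):

<syntaxhighlight lang="python">
def power_mean(xs, m):
    """Power mean with exponent m (m must be non-zero)."""
    return (sum(x ** m for x in xs) / len(xs)) ** (1 / m)

values = [34, 27, 45, 55, 22, 34]

print(power_mean(values, -1))  # harmonic mean,   ≈ 33.018
print(power_mean(values, 1))   # arithmetic mean, ≈ 36.167
print(power_mean(values, 2))   # quadratic mean,  ≈ 37.804
</syntaxhighlight>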

f-mean

This can be generalized further as the generalized f-mean

<math> \bar{x} = f^{-1}\left({\frac{1}{n}\cdot\sum_{i=1}^n{f(x_i)}}\right) </math>

and again a suitable choice of an invertible <math>f</math> will give

  • <math>f(x) = x</math>: arithmetic mean
  • <math>f(x) = \frac{1}{x}</math>: harmonic mean
  • <math>f(x) = x^m</math>: power mean
  • <math>f(x) = \ln x</math>: geometric mean
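
A minimal Python sketch of the generalized f-mean (the helper name f_mean is illustrative; with f = log it reproduces the geometric mean computed above):

<syntaxhighlight lang="python">
import math

def f_mean(xs, f, f_inv):
    """Generalized f-mean: invert f applied to the average of f(x_i)."""
    return f_inv(sum(f(x) for x in xs) / len(xs))

values = [34, 27, 45, 55, 22, 34]

print(f_mean(values, math.log, math.exp))                # geometric mean, ≈ 34.545
print(f_mean(values, lambda x: 1 / x, lambda y: 1 / y))  # harmonic mean,  ≈ 33.018
</syntaxhighlight>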

Weighted arithmetic mean

The weighted arithmetic mean is used when one wants to combine average values from samples of the same population with different sample sizes:

<math> \bar{x} = \frac{\sum_{i=1}^n{w_i \cdot x_i}}{\sum_{i=1}^n {w_i}} </math>

The weights <math>w_i</math> represent the sizes of the partial samples. In other applications, they represent a measure of how reliably the respective values influence the mean.
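
A small Python illustration, assuming two hypothetical samples whose means 36.0 and 42.0 are weighted by their sample sizes 6 and 4:

<syntaxhighlight lang="python">
# Hypothetical sample means and the sizes of the samples they came from.
means = [36.0, 42.0]
sizes = [6, 4]

weighted = sum(w * x for w, x in zip(sizes, means)) / sum(sizes)
print(weighted)   # (6*36.0 + 4*42.0) / 10 = 38.4
</syntaxhighlight>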

Truncated mean

Sometimes a set of numbers (the data) might be contaminated by inaccurate outliers, i.e. values which are much too low or much too high. In this case one can use a truncated mean: discard a given portion of the data at the top and the bottom end, typically an equal amount at each end, and then take the arithmetic mean of the remaining data. The number of values removed is indicated as a percentage of the total number of values.
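
A rough Python sketch of a truncated mean (the function name and the data, including the artificial outlier 900, are illustrative only):

<syntaxhighlight lang="python">
def truncated_mean(xs, percent):
    """Drop `percent` % of the values at each end, then average the rest."""
    xs = sorted(xs)
    k = int(len(xs) * percent / 100)   # number of values dropped per end
    trimmed = xs[k:len(xs) - k] if k else xs
    return sum(trimmed) / len(trimmed)

data = [22, 27, 34, 34, 45, 55, 900]   # 900 is an artificial outlier
print(sum(data) / len(data))           # ordinary mean ≈ 159.6, pulled up by 900
print(truncated_mean(data, 15))        # one value trimmed per end: 39.0
</syntaxhighlight>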

Interquartile mean

The interquartile mean is a specific example of a truncated mean. It is simply the arithmetic mean after removing the lowest and the highest quarter of values.

<math> \bar{x} = {2 \over n} \sum_{i=(n/4)+1}^{3n/4}{x_i} </math>

assuming the values have been ordered.
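
A minimal Python sketch of the interquartile mean, assuming for simplicity that the number of values is divisible by four (the twelve data values are illustrative):

<syntaxhighlight lang="python">
def interquartile_mean(xs):
    """Mean of the middle half of the sorted data (n divisible by 4)."""
    xs = sorted(xs)
    quarter = len(xs) // 4
    middle = xs[quarter:len(xs) - quarter]
    return sum(middle) / len(middle)

data = [5, 8, 4, 38, 8, 6, 9, 7, 7, 3, 1, 6]
print(interquartile_mean(data))   # mean of [5, 6, 6, 7, 7, 8] = 6.5
</syntaxhighlight>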

Mean of a function

In calculus, and especially multivariable calculus, the mean of a function is loosely defined as the average value of the function over its domain. In one variable, the mean of a function f(x) over the interval (a,b) is defined by

<math>\bar{f}=\frac{1}{b-a}\int_a^bf(x)\,dx.</math>

(See also mean value theorem.) In several variables, the mean over a relatively compact domain U in a Euclidean space is defined by

<math>\bar{f}=\frac{1}{\hbox{Vol}(U)}\int_U f.</math>

This generalizes the arithmetic mean. On the other hand, it is also possible to generalize the geometric mean to functions by defining the geometric mean of f to be

<math>\exp\left(\frac{1}{\hbox{Vol}(U)}\int_U \log f\right).</math>

More generally, in measure theory and probability theory either sort of mean plays an important role. In this context, Jensen's inequality places sharp estimates on the relationship between these two different notions of the mean of a function.
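
A rough numerical sketch in Python of the mean of a function over an interval, using a simple midpoint rule (the helper function_mean and the chosen test functions are illustrative, not standard routines):

<syntaxhighlight lang="python">
import math

def function_mean(f, a, b, steps=10_000):
    """Approximate (1/(b-a)) * integral of f over (a, b) by the midpoint rule."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) / steps

# Mean value of sin(x) over (0, pi); the exact value is 2/pi ≈ 0.6366.
print(function_mean(math.sin, 0, math.pi))

# Geometric mean of f(x) = x over (1, 2): exp of the mean of log f, exactly 4/e ≈ 1.4715.
print(math.exp(function_mean(math.log, 1, 2)))
</syntaxhighlight>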

Mean of angles

Most of the usual means fail for circular quantities such as angles, times of day, and fractional parts of real numbers. For those quantities, a mean of circular quantities is needed.
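
A minimal Python sketch of such a circular mean for angles, averaging the unit vectors and taking the angle of the result (the helper name circular_mean is illustrative):

<syntaxhighlight lang="python">
import math

def circular_mean(angles_deg):
    """Mean angle: average the unit vectors and take the angle of the result."""
    s = sum(math.sin(math.radians(a)) for a in angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg)
    return math.degrees(math.atan2(s, c))

print(circular_mean([350, 10]))   # ≈ 0 (up to floating-point rounding)
print((350 + 10) / 2)             # 180.0, the misleading arithmetic mean
</syntaxhighlight>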

Other means

Properties

Apart from specific examples, there seems to be no consensus on what a mean actually is. However, many means share certain properties, which we collect here as an attempt at defining the term.

Weighted mean

A weighted mean <math>M</math> is a function which maps tuples of positive numbers to a positive number (<math>\mathbb{R}_{>0}^n\to\mathbb{R}_{>0}</math>).

  • "Fixed point": <math> M(1,1,\dots,1) = 1 </math>
  • Homogeneity: <math> \forall\lambda\ \forall x\ M(\lambda\cdot x_1, \dots, \lambda\cdot x_n) = \lambda \cdot M(x_1, \dots, x_n) </math>
(using vector notation: <math> \forall\lambda\ \forall x\ M(\lambda\cdot x) = \lambda \cdot M x </math>)
  • Monotonicity: <math> \forall x\ \forall y\ (\forall i\ x_i \le y_i) \Rightarrow M x \le M y </math>

It follows

  • Boundedness: <math> \forall x\ M x \in [\min x, \max x] </math>
  • Continuity: <math> \lim_{x\to y} M x = M y </math>
Sketch of a proof: Because <math>\forall x\ \forall y\ \left(||x-y||_\infty\le\varepsilon\cdot\min x \Rightarrow \forall i\ |x_i-y_i|\le\varepsilon\cdot x_i\right)</math> and <math>M((1+\varepsilon)\cdot x) = (1+\varepsilon)\cdot M x</math>, it follows that <math>\forall x\ \forall \varepsilon>0\ \forall y\ ||x-y||_\infty\le\varepsilon\cdot\min x \Rightarrow |Mx-My|\le\varepsilon\cdot Mx</math>.
  • There are means which are not differentiable. For instance, the maximum of a tuple is considered a mean (as an extreme case of the power mean, or as a special case of a median), but it is not differentiable.
  • All means listed above, with the exception of most of the generalized f-means, satisfy the presented properties.
    • If <math>f</math> is bijective, then the generalized f-mean satisfies the fixed point property.
    • If <math>f</math> is strictly monotonic, then the generalized f-mean also satisfies the monotonicity property.
    • In general, a generalized f-mean will not be homogeneous.

The above properties imply techniques to construct more complex means:

If <math>C, M_1, \dots, M_m</math> are weighted means and <math>p</math> is a positive real number, then <math>A</math> and <math>B</math> with

<math> \forall x\ A x = C(M_1 x, \dots, M_m x) </math>
<math> \forall x\ B x = \sqrt[p]{C(x_1^p, \dots, x_n^p)} </math>

are also weighted means.
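
A short Python illustration of the second construction, taking C to be the ordinary arithmetic mean and p = 2, which yields the quadratic mean (the helper names are illustrative):

<syntaxhighlight lang="python">
def arithmetic(xs):
    return sum(xs) / len(xs)

def power_transformed(C, p):
    """New mean B: raise each input to the p-th power, apply C, take the p-th root."""
    return lambda xs: C([x ** p for x in xs]) ** (1 / p)

rms = power_transformed(arithmetic, 2)    # quadratic mean (root mean square)
print(rms([34, 27, 45, 55, 22, 34]))      # ≈ 37.804
</syntaxhighlight>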

Unweighted mean

Intuitively speaking, an unweighted mean is a weighted mean with equal weights. Since our definition of a weighted mean above does not expose particular weights, equal weights must be expressed in a different way. One way to view equal weighting is that the inputs can be swapped without altering the result.

Thus we define <math>M</math> to be an unweighted mean if it is a weighted mean and, for each permutation <math>\pi</math> of the inputs, the result is the same. Let <math>P</math> be the set of permutations of <math>n</math>-tuples.

Symmetry: <math> \forall x\ \forall \pi\in P \ M x = M(\pi x) </math>

Analogously to the weighted means, if <math>C</math> is a weighted mean, <math>M_1, \dots, M_m</math> are unweighted means, and <math>p</math> is a positive real number, then <math>A</math> and <math>B</math> with

<math> \forall x\ A x = C(M_1 x, \dots, M_m x) </math>
<math> \forall x\ B x = \sqrt[p]{M_1(x_1^p, \dots, x_n^p)} </math>

are also unweighted means.

Convert unweighted mean to weighted mean

An unweighted mean can be turned into a weighted mean by repeating elements. This connection can also be used to state that a mean is the weighted version of an unweighted mean. Say you have the unweighted mean <math>M</math> and want to weight the numbers by natural numbers <math>a_1,\dots,a_n</math>. (If the weights are rational, multiply them by the least common denominator to make them natural numbers.) Then the corresponding weighted mean <math>A</math> is obtained by

<math>A(x_1,\dots,x_n) = M(\underbrace{x_1,\dots,x_1}_{a_1},x_2,\dots,x_{n-1},\underbrace{x_n,\dots,x_n}_{a_n})</math>.
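
A minimal Python sketch of this conversion, reusing the hypothetical sample means 36.0 and 42.0 with weights 6 and 4 from the weighted-mean example above:

<syntaxhighlight lang="python">
def unweighted_arithmetic(xs):
    return sum(xs) / len(xs)

def weight_by_repetition(M, values, weights):
    """Weighted mean built from an unweighted mean M by repeating each value."""
    repeated = [x for x, a in zip(values, weights) for _ in range(a)]
    return M(repeated)

# Repeating 36.0 six times and 42.0 four times reproduces the weighted mean 38.4.
print(weight_by_repetition(unweighted_arithmetic, [36.0, 42.0], [6, 4]))
</syntaxhighlight>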

Means of tuples of different sizes

If a mean <math>M</math> is defined for tuples of several sizes, then one also expects that the mean of a tuple is bounded by the means of its partitions. More precisely:

  • Given an arbitrary tuple <math>x</math> which is partitioned into <math>y_1, \dots, y_k</math>, it holds that <math>M x \in \mathrm{convexhull}(M y_1, \dots, M y_k)</math>. (See convex hull.)

Population and sample means

A normally distributed population has a mean of μ, known as the population mean. The sample mean is a good estimator of the population mean, since its expected value equals the population mean. The sample mean is a random variable, not a constant, and consequently has its own distribution. For a random sample of n observations from a normally distributed population, the distribution of the sample mean is

<math>\bar{x} \sim N\left(\mu, \frac{\sigma^2}{n}\right).</math>

Often the population variance is an unknown parameter and is estimated by the sample variance (the mean sum of squared deviations); standardizing the sample mean with this estimate changes its distribution from a normal distribution to a Student's t distribution with n − 1 degrees of freedom.
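
A rough simulation sketch in Python (the population parameters μ = 10, σ = 3 and the sample size n = 25 are hypothetical):

<syntaxhighlight lang="python">
import math
import random
import statistics

mu, sigma, n = 10.0, 3.0, 25   # hypothetical population mean, s.d. and sample size

sample_means = [
    statistics.mean(random.gauss(mu, sigma) for _ in range(n))
    for _ in range(10_000)
]

print(statistics.mean(sample_means))   # close to mu = 10
print(statistics.stdev(sample_means))  # close to sigma / sqrt(n) = 0.6
print(sigma / math.sqrt(n))            # 0.6
</syntaxhighlight>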

Mathematics education

In many state and government curriculum standards, students are traditionally expected to learn either the meaning of or the formula for the mean by the fourth grade. However, in many standards-based mathematics curricula, students are encouraged to invent their own methods and may not be taught the traditional method. Reform-based texts such as TERC in fact discourage teaching the traditional "add the numbers and divide by the number of items" method in favor of spending more time on the concept of the median, which does not require division. However, the mean can be computed with a simple four-function calculator, while finding the median requires sorting the data. The TERC teacher guide devotes several pages to how to find the median of a set, which is judged to be simpler than finding the mean.

See also

External links
