## Summary

A weight function is a mathematical device used when performing a sum, integral, or average to give some elements more "weight" or influence on the result than other elements in the same set. The result of this application of a weight function is a weighted sum or weighted average. Weight functions occur frequently in statistics and analysis, and are closely related to the concept of a measure. Weight functions can be employed in both discrete and continuous settings. They can be used to construct systems of calculus called "weighted calculus" and "meta-calculus".

## Discrete weights

### General definition

In the discrete setting, a weight function $w\colon A\to \mathbb {R} ^{+}$  is a positive function defined on a discrete set $A$ , which is typically finite or countable. The weight function $w(a):=1$  corresponds to the unweighted situation in which all elements have equal weight. One can then apply this weight to various concepts.

If the function $f\colon A\to \mathbb {R}$  is a real-valued function, then the unweighted sum of $f$  on $A$  is defined as

$\sum _{a\in A}f(a);$

but given a weight function $w\colon A\to \mathbb {R} ^{+}$ , the weighted sum or conical combination is defined as

$\sum _{a\in A}f(a)w(a).$

One common application of weighted sums arises in numerical integration.
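As a concrete illustration, both sums can be computed directly; the set $A$ and the functions $f$ and $w$ below are illustrative choices, not taken from the text.

```python
# Weighted sum over a discrete set A (illustrative choices of A, f, w).
A = [1, 2, 3, 4]
f = lambda a: a ** 2   # real-valued function on A
w = lambda a: a        # positive weight function on A

unweighted = sum(f(a) for a in A)        # sum of f(a):       1+4+9+16  = 30
weighted = sum(f(a) * w(a) for a in A)   # sum of f(a) w(a):  1+8+27+64 = 100
```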

If $B$ is a finite subset of $A$, one can replace the unweighted cardinality $|B|$ of $B$ by the weighted cardinality

$\sum _{a\in B}w(a).$
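A minimal sketch, with a hypothetical weight function on a three-element set:

```python
# Weighted cardinality of a finite subset B of A (illustrative weights).
w = {"a": 1.0, "b": 2.0, "c": 0.5}   # weight function on A = {"a", "b", "c"}
B = {"a", "c"}

weighted_card = sum(w[x] for x in B)  # 1.0 + 0.5 = 1.5, vs. unweighted |B| = 2
```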

If $A$ is a finite non-empty set, one can replace the unweighted mean or average

${\frac {1}{|A|}}\sum _{a\in A}f(a)$

by the weighted mean or weighted average

${\frac {\sum _{a\in A}f(a)w(a)}{\sum _{a\in A}w(a)}}.$

In this case only the relative weights are relevant.
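A minimal sketch of the weighted mean with illustrative values; rescaling every weight by the same constant leaves the result unchanged, which is what "only the relative weights are relevant" means in practice.

```python
# Weighted mean over a finite set A (illustrative values and weights).
A = [1, 2, 3]
f = {1: 10.0, 2: 20.0, 3: 30.0}   # values
w = {1: 1.0, 2: 1.0, 3: 2.0}      # weights

weighted_mean = sum(f[a] * w[a] for a in A) / sum(w[a] for a in A)
# (10 + 20 + 60) / (1 + 1 + 2) = 22.5

# Only relative weights matter: scaling every weight by 5 changes nothing.
w5 = {a: 5.0 * w[a] for a in A}
rescaled_mean = sum(f[a] * w5[a] for a in A) / sum(w5[a] for a in A)
```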

### Statistics

Weighted means are commonly used in statistics to compensate for the presence of bias. For a quantity $f$  measured multiple independent times $f_{i}$  with variance $\sigma _{i}^{2}$ , the best estimate of the signal is obtained by averaging all the measurements with weight ${\textstyle w_{i}=1/{\sigma _{i}^{2}}}$ ; the variance of the resulting estimate, ${\textstyle \sigma ^{2}=1/\sum _{i}w_{i}}$ , is smaller than that of each individual measurement. The maximum likelihood method weights the difference between fit and data using the same weights $w_{i}$ .
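The inverse-variance weighting described above can be sketched as follows; the measurement values and standard deviations are illustrative, not drawn from any real data set.

```python
# Inverse-variance weighted average of repeated measurements of one quantity.
# (Illustrative numbers.)
measurements = [10.2, 9.8, 10.5]   # f_i
sigmas = [0.5, 0.2, 1.0]           # standard deviation of each measurement

weights = [1.0 / s ** 2 for s in sigmas]   # w_i = 1 / sigma_i^2
estimate = sum(m * wi for m, wi in zip(measurements, weights)) / sum(weights)
combined_variance = 1.0 / sum(weights)     # sigma^2 = 1 / sum(w_i)
# combined_variance is smaller than every individual sigma_i^2
```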

The expected value of a random variable is the weighted average of the possible values it might take on, with the weights being the respective probabilities. More generally, the expected value of a function of a random variable is the probability-weighted average of the values the function takes on for each possible value of the random variable.
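For instance, the expected value of a fair six-sided die roll, and of a function of that roll, are probability-weighted averages (a minimal sketch):

```python
# Expected value as a probability-weighted average (fair six-sided die).
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6    # the weights are the respective probabilities

expectation = sum(v * p for v, p in zip(values, probs))           # 21/6 = 3.5
expected_square = sum(v ** 2 * p for v, p in zip(values, probs))  # 91/6
```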

In regressions in which the dependent variable is assumed to be affected by both current and lagged (past) values of the independent variable, one estimates a distributed lag function, a weighted average of the current and various lagged values of the independent variable. Similarly, a moving average model specifies an evolving variable as a weighted average of current and various lagged values of a random variable.
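A weighted moving average of this kind might be sketched as follows, with hypothetical lag weights:

```python
# Moving average: each smoothed value is a weighted average of the current
# and lagged values of a series (illustrative series and lag weights).
series = [1.0, 2.0, 4.0, 8.0, 16.0]
lag_weights = [0.5, 0.3, 0.2]   # weight on current, lag-1, and lag-2 values

smoothed = [
    sum(wk * series[t - k] for k, wk in enumerate(lag_weights))
    for t in range(len(lag_weights) - 1, len(series))
]
# smoothed[0] combines series[2], series[1], series[0], and so on
```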

### Mechanics

The terminology weight function arises from mechanics: if one has a collection of $n$  objects on a lever, with weights $w_{1},\ldots ,w_{n}$  (where weight is now interpreted in the physical sense) and locations ${\boldsymbol {x}}_{1},\dotsc ,{\boldsymbol {x}}_{n}$ , then the lever will be in balance if the fulcrum of the lever is at the center of mass

${\frac {\sum _{i=1}^{n}w_{i}{\boldsymbol {x}}_{i}}{\sum _{i=1}^{n}w_{i}}},$

which is also the weighted average of the positions ${\boldsymbol {x}}_{i}$ .
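A balance-point computation under these definitions, using illustrative weights and one-dimensional positions:

```python
# Center of mass of weights on a lever, as a weighted average of positions.
# (Illustrative values; positions are one-dimensional here.)
weights = [2.0, 1.0, 1.0]
positions = [0.0, 2.0, 6.0]

fulcrum = sum(w * x for w, x in zip(weights, positions)) / sum(weights)
# (2*0 + 1*2 + 1*6) / (2 + 1 + 1) = 2.0
```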

## Continuous weights

In the continuous setting, a weight is a positive measure such as $w(x)\,dx$  on some domain $\Omega$ , which is typically a subset of a Euclidean space $\mathbb {R} ^{n}$ , for instance $\Omega$  could be an interval $[a,b]$ . Here $dx$  is Lebesgue measure and $w\colon \Omega \to \mathbb {R} ^{+}$  is a non-negative measurable function. In this context, the weight function $w(x)$  is sometimes referred to as a density.

### General definition

If $f\colon \Omega \to \mathbb {R}$  is a real-valued function, then the unweighted integral

$\int _{\Omega }f(x)\ dx$

can be generalized to the weighted integral

$\int _{\Omega }f(x)w(x)\,dx.$

Note that one may need to require $f$  to be absolutely integrable with respect to the weight $w(x)\,dx$  in order for this integral to be finite.
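Numerically, such a weighted integral can be approximated by a Riemann sum; the functions below are illustrative choices on $\Omega = [0,1]$, for which the exact value is $\int _{0}^{1}x\cdot 2x\,dx=2/3$.

```python
# Midpoint-rule approximation of the weighted integral of f with weight w
# on [0, 1] (illustrative choices of f and w).
f = lambda x: x
w = lambda x: 2.0 * x   # non-negative weight (density) on [0, 1]

n = 100_000
h = 1.0 / n
approx = sum(f((i + 0.5) * h) * w((i + 0.5) * h) * h for i in range(n))
# exact value: integral of 2x^2 over [0, 1] = 2/3
```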

### Weighted volume

If $E$ is a subset of $\Omega$ , then the volume $\mathrm {vol} (E)$  of $E$  can be generalized to the weighted volume

$\int _{E}w(x)\ dx.$

### Weighted average

If $\Omega$  has finite non-zero weighted volume, then we can replace the unweighted average

${\frac {1}{\mathrm {vol} (\Omega )}}\int _{\Omega }f(x)\ dx$

by the weighted average

${\frac {\int _{\Omega }f(x)\,w(x)\,dx}{\int _{\Omega }w(x)\,dx}}.$
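A numerical sketch of this weighted average on $[0,1]$, with illustrative $f$ and $w$; the exact value here is $(\int _{0}^{1}3x^{3}\,dx)/(\int _{0}^{1}3x^{2}\,dx)=3/4$.

```python
# Weighted average of f over [0, 1] with weight w, via midpoint Riemann sums.
# (Illustrative choices of f and w.)
f = lambda x: x
w = lambda x: 3.0 * x ** 2   # non-negative weight with total mass 1 on [0, 1]

n = 100_000
h = 1.0 / n
numerator = sum(f((i + 0.5) * h) * w((i + 0.5) * h) * h for i in range(n))
denominator = sum(w((i + 0.5) * h) * h for i in range(n))
weighted_average = numerator / denominator   # exact value is 3/4
```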

### Bilinear form

If $f\colon \Omega \to {\mathbb {R} }$  and $g\colon \Omega \to {\mathbb {R} }$  are two functions, one can generalize the unweighted bilinear form

$\langle f,g\rangle :=\int _{\Omega }f(x)g(x)\ dx$

to a weighted bilinear form

$\langle f,g\rangle :=\int _{\Omega }f(x)g(x)\ w(x)\ dx.$

See the entry on orthogonal polynomials for examples of weighted orthogonal functions.
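As a numerical sanity check of such weighted orthogonality, the sketch below approximates the weighted bilinear form with the Laguerre weight $w(x)=e^{-x}$ on $[0,\infty )$ (truncated to $[0,50]$) and verifies that the first two Laguerre polynomials $L_{0}(x)=1$ and $L_{1}(x)=1-x$ are orthogonal; the truncation bound and grid size are illustrative choices.

```python
import math

# Weighted bilinear form <f, g> = integral of f(x) g(x) w(x) dx, approximated
# by a midpoint Riemann sum on [0, 50] (truncating the rapidly decaying tail
# of the Laguerre weight e^{-x}).
w = lambda x: math.exp(-x)
L0 = lambda x: 1.0
L1 = lambda x: 1.0 - x

def weighted_form(f, g, a=0.0, b=50.0, n=100_000):
    h = (b - a) / n
    return sum(
        f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h) * w(a + (i + 0.5) * h) * h
        for i in range(n)
    )

orthogonality = weighted_form(L0, L1)   # ~ 0: L0 and L1 are w-orthogonal
norm_sq = weighted_form(L0, L0)         # ~ 1: integral of e^{-x} over [0, inf)
```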