Decorrelation

Decorrelation is a general term for any process used to reduce autocorrelation within a signal, or cross-correlation within a set of signals, while preserving other aspects of the signal.[1] A frequently used method of decorrelation is the use of a matched linear filter to reduce the autocorrelation of a signal as far as possible. Since, for a given signal energy, the minimum possible autocorrelation is achieved by flattening the power spectrum of the signal until it resembles that of white noise, this is often referred to as signal whitening.
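
As an illustrative sketch of this idea (not drawn from the cited sources), the following Python code removes cross-correlation from a set of signals by whitening them with an eigendecomposition of their sample covariance matrix; the function name whiten and the synthetic three-channel test data are assumptions made for the example. The same principle, applied along the time axis of a single signal rather than across channels, corresponds to the spectral whitening described above.

    import numpy as np

    def whiten(signals, eps=1e-12):
        """Decorrelate the rows of `signals` (channels x samples) so that the
        sample covariance of the output is approximately the identity matrix."""
        centred = signals - signals.mean(axis=1, keepdims=True)
        cov = centred @ centred.T / centred.shape[1]        # sample covariance
        eigvals, eigvecs = np.linalg.eigh(cov)              # symmetric eigendecomposition
        # Rotate into the eigenbasis and scale each component to unit variance.
        transform = np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T
        return transform @ centred

    rng = np.random.default_rng(0)
    mixed = rng.standard_normal((3, 10000))
    mixed[1] += 0.8 * mixed[0]                              # introduce cross-correlation
    white = whiten(mixed)
    print(np.round(white @ white.T / white.shape[1], 2))    # close to the identity matrix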

Process

Most decorrelation algorithms are linear, but there are also non-linear decorrelation algorithms.

Many data compression algorithms incorporate a decorrelation stage.[2] For example, many transform coders first apply a fixed linear transformation that would, on average, have the effect of decorrelating a typical signal of the class to be coded, prior to any later processing. This is typically a Karhunen–Loève transform, or a simplified approximation such as the discrete cosine transform.
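
As a minimal sketch of such a transform stage (the 8×8 block size and the smoothly varying synthetic block are assumptions made for the example), the following Python code applies an orthonormal two-dimensional discrete cosine transform and shows that most of the block's energy collapses into a few low-frequency coefficients:

    import numpy as np
    from scipy.fft import dctn

    # Synthetic 8x8 block with smoothly varying, highly correlated pixels.
    x, y = np.meshgrid(np.arange(8), np.arange(8))
    block = 128 + 40 * np.sin(0.3 * x + 0.2 * y)

    coeffs = dctn(block, norm='ortho')       # orthonormal 2-D DCT
    energy = coeffs ** 2
    print("fraction of energy in the 2x2 low-frequency corner:",
          energy[:2, :2].sum() / energy.sum())

In a real transform coder, the resulting coefficients would then be quantised and entropy-coded in the later processing stages.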

By comparison, sub-band coders do not generally have an explicit decorrelation step, but instead exploit the already-existing reduced correlation within each of the sub-bands of the signal, due to the relative flatness of each sub-band of the power spectrum in many classes of signals.

Linear predictive coders can be modelled as an attempt to decorrelate signals by subtracting the best possible linear prediction from the input signal, leaving a whitened residual signal.
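
A minimal sketch of this interpretation, assuming a prediction order of 8 and a synthetic first-order autoregressive test signal (both choices are illustrative rather than taken from any particular coder), might look as follows; the prediction coefficients are obtained by solving the Yule–Walker normal equations:

    import numpy as np

    def lpc_residual(signal, order=8):
        """Subtract the best linear prediction of each sample from its
        `order` predecessors, leaving an approximately whitened residual."""
        n = len(signal)
        # Biased autocorrelation estimates r[0..order].
        r = np.array([signal[:n - k] @ signal[k:] for k in range(order + 1)]) / n
        # Yule-Walker normal equations: R a = r[1:], with R a Toeplitz matrix.
        R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
        a = np.linalg.solve(R, r[1:])
        predicted = np.convolve(signal, np.concatenate(([0.0], a)))[:n]
        return signal - predicted

    rng = np.random.default_rng(1)
    # Strongly autocorrelated test signal: white noise through a one-pole filter.
    noise = rng.standard_normal(20000)
    coloured = np.empty_like(noise)
    prev = 0.0
    for i, v in enumerate(noise):
        prev = 0.9 * prev + v
        coloured[i] = prev

    residual = lpc_residual(coloured)
    print("lag-1 autocorrelation before:", np.corrcoef(coloured[:-1], coloured[1:])[0, 1])
    print("lag-1 autocorrelation after: ", np.corrcoef(residual[:-1], residual[1:])[0, 1])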

Decorrelation techniques can also be used for many other purposes, such as reducing crosstalk in a multi-channel signal, or in the design of echo cancellers.

In image processing, decorrelation techniques can be used to enhance or stretch the colour differences found in each pixel of an image. This is generally termed 'decorrelation stretching'.
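
A minimal sketch of a decorrelation stretch, assuming a synthetic RGB image with strongly correlated colour bands and an arbitrary target standard deviation of 30 for each decorrelated component, might look as follows:

    import numpy as np

    def decorrelation_stretch(image, target_sigma=30.0):
        """Decorrelate the colour bands of `image` (H x W x 3), rescale each
        decorrelated component to `target_sigma`, then rotate back to RGB.
        This exaggerates subtle colour differences between the bands."""
        pixels = image.reshape(-1, 3).astype(float)
        mean = pixels.mean(axis=0)
        cov = np.cov(pixels, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)
        # Whiten in the eigenbasis, scale to the target spread, rotate back.
        stretch = eigvecs @ np.diag(target_sigma / np.sqrt(eigvals)) @ eigvecs.T
        out = (pixels - mean) @ stretch.T + mean
        return np.clip(out, 0, 255).reshape(image.shape).astype(np.uint8)

    # Synthetic image whose three colour bands are almost identical.
    rng = np.random.default_rng(2)
    base = rng.uniform(60, 200, size=(64, 64))
    img = np.clip(np.stack([base + rng.normal(0, 5, base.shape) for _ in range(3)],
                           axis=-1), 0, 255).astype(np.uint8)
    stretched = decorrelation_stretch(img)
    print("red/green correlation before:",
          round(np.corrcoef(img.reshape(-1, 3).T.astype(float))[0, 1], 2))
    print("red/green correlation after: ",
          round(np.corrcoef(stretched.reshape(-1, 3).T.astype(float))[0, 1], 2))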

The concept of decorrelation can be applied in many other fields. In neuroscience, decorrelation is used in the analysis of the neural networks in the human visual system. In cryptography, it is used in cipher design (see Decorrelation theory) and in the design of hardware random number generators.

References

  1. "Decorrelation - an overview | ScienceDirect Topics". www.sciencedirect.com. Retrieved 2020-09-25.
  2. "Data Compression - an overview | ScienceDirect Topics". www.sciencedirect.com. Retrieved 2020-09-25.
External links

  • Non-linear decorrelation algorithms
  • Associative Decorrelation Dynamics in Visual Cortex