Time deviation

Time deviation (TDEV),[1] also known as σx(τ), is the time stability of phase x versus observation interval τ of the measured clock source. The time deviation thus forms a standard-deviation type of measurement that indicates the time instability of the signal source. It is a time-scaled variant of the frequency stability given by the Allan deviation. It is commonly defined from the modified Allan deviation, but other estimators may be used.

Time variance (TVAR), also known as σx²(τ), is the time stability of phase versus observation interval τ. It is a scaled variant of the modified Allan variance.

TDEV is a metric often used to characterize the quality of timing signals in telecommunication applications; it is a statistical analysis of the phase stability of a signal over a given period. Measurements of a reference timing signal typically report its TDEV and maximum time interval error (MTIE) values and compare them against specified masks or goals.

Definition

The most common estimator uses the modified Allan variance:

    σx²(τ) = (τ²/3) · Mod σy²(τ)

where τ = n·τ₀, with n the averaging factor and τ₀ the base sampling interval. The 3 in the denominator normalizes TVAR to be equal to the classical variance if the deviations in x are random and uncorrelated (white phase noise).

TDEV, the square root of TVAR, may equivalently be derived from the modified Allan deviation (MDEV):

    σx(τ) = (τ/√3) · Mod σy(τ)
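As a sketch of how this estimator can be computed in practice (using a hypothetical `tdev` helper, not a standard library function), the following Python code evaluates the modified Allan variance from a series of phase samples by summing n consecutive second differences in sliding windows, then scales the result by τ/√3:

```python
import numpy as np

def tdev(x, tau0, n):
    """Time deviation at tau = n * tau0 from phase samples x (seconds)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    if N < 3 * n:
        raise ValueError("need at least 3n phase samples")
    # Second differences of the phase at lag n: x[i+2n] - 2*x[i+n] + x[i]
    d = x[2 * n:] - 2.0 * x[n:-n] + x[:-2 * n]
    # Sliding sums of n consecutive second differences, via cumulative sums
    c = np.concatenate(([0.0], np.cumsum(d)))
    s = c[n:] - c[:-n]                      # N - 3n + 1 window sums
    # Modified Allan variance estimate, Mod sigma_y^2(tau)
    mvar = np.mean(s ** 2) / (2.0 * n ** 4 * tau0 ** 2)
    # TDEV = (tau / sqrt(3)) * MDEV
    return (n * tau0 / np.sqrt(3)) * np.sqrt(mvar)
```

As a sanity check, a pure frequency offset (a linear phase ramp) yields TDEV = 0, since all second differences of the phase vanish.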

References

  1. NIST SP 1065: Handbook of Frequency Stability Analysis