In the mathematical discipline of linear algebra, the **Schur decomposition** or **Schur triangulation**, named after Issai Schur, is a matrix decomposition. It allows one to write an arbitrary complex square matrix as unitarily equivalent to an upper triangular matrix whose diagonal elements are the eigenvalues of the original matrix.

The Schur decomposition reads as follows: if *A* is an *n* × *n* square matrix with complex entries, then *A* can be expressed as^{[1]}^{[2]}^{[3]}

*A* = *QUQ*^{−1}

for some unitary matrix *Q* (so that the inverse *Q*^{−1} is also the conjugate transpose *Q** of *Q*), and some upper triangular matrix *U*. This is called a **Schur form** of *A*. Since *U* is similar to *A*, it has the same spectrum, and since it is triangular, its eigenvalues are the diagonal entries of *U*.

The Schur decomposition implies that there exists a nested sequence of *A*-invariant subspaces {0} = *V*_{0} ⊂ *V*_{1} ⊂ ⋯ ⊂ *V _{n}* = ℂ^{*n*}, and that there exists an ordered orthonormal basis (the columns of *Q*) such that the first *i* basis vectors span *V _{i}* for each *i* occurring in the nested sequence.
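This invariance can be checked directly: since *U* is upper triangular, *AQ*[:, :*k*] = *Q*[:, :*k*] *U*[:*k*, :*k*], so the span of the first *k* columns of *Q* is mapped into itself by *A*. A short numerical illustration (not from the article):

```python
# The leading k columns of Q span an A-invariant subspace:
# A Q[:, :k] = Q[:, :k] U[:k, :k] because U is upper triangular.
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
U, Q = schur(A, output="complex")

for k in range(1, 5):
    assert np.allclose(A @ Q[:, :k], Q[:, :k] @ U[:k, :k])
```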

A constructive proof for the Schur decomposition is as follows: every operator *A* on a complex finite-dimensional vector space has an eigenvalue *λ*, corresponding to some eigenspace *V _{λ}*. Let *V _{λ}*^{⊥} be its orthogonal complement. With respect to the orthogonal decomposition ℂ^{*n*} = *V _{λ}* ⊕ *V _{λ}*^{⊥}, the matrix of *A* is block upper triangular, with *λI* in the upper-left block. Applying the same argument inductively to the lower-right block, which represents an operator on the lower-dimensional space *V _{λ}*^{⊥}, and composing the resulting unitary changes of basis yields the required triangularization.

The above argument can be slightly restated as follows: let *λ* be an eigenvalue of *A*, corresponding to some eigenspace *V _{λ}*. Then *A* induces an operator on the quotient space ℂ^{*n*}/*V _{λ}*, which again has an eigenvalue and eigenspace; repeating this process produces the nested sequence of *A*-invariant subspaces described above, from which an orthonormal basis realizing the Schur form can be extracted.

Although every square matrix has a Schur decomposition, in general this decomposition is not unique. For example, the eigenspace *V _{λ}* can have dimension > 1, in which case any orthonormal basis for *V _{λ}* can be used in the construction above, and different choices generally lead to different Schur forms.

Write the triangular matrix *U* as *U* = *D* + *N*, where *D* is diagonal and *N* is strictly upper triangular (and thus a nilpotent matrix). The diagonal matrix *D* contains the eigenvalues of *A* in arbitrary order (hence its Frobenius norm, squared, is the sum of the squared moduli of the eigenvalues of *A*, while
the Frobenius norm of *A*, squared, is the sum of the squared singular values of *A*). The nilpotent part *N* is generally not unique either, but its Frobenius norm is uniquely determined by *A* (simply because the Frobenius norm of *A* equals the Frobenius norm of *U* = *D* + *N*).^{[5]}
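Since *D* and *N* occupy disjoint entries of *U*, this gives ‖*N*‖_{F}^{2} = ‖*A*‖_{F}^{2} − Σ |*λ _{i}*|^{2}, which is easy to confirm numerically (an illustrative check, not from the article):

```python
# Check that ||N||_F^2 = ||A||_F^2 - sum_i |lambda_i|^2 for the split U = D + N.
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
U, Q = schur(A, output="complex")

D = np.diag(np.diag(U))   # diagonal part: eigenvalues of A
N = U - D                 # strictly upper triangular, nilpotent part
lhs = np.linalg.norm(N, "fro") ** 2
rhs = np.linalg.norm(A, "fro") ** 2 - np.sum(np.abs(np.diag(U)) ** 2)
assert np.isclose(lhs, rhs)
```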

It is clear that if *A* is a normal matrix, then *U* from its Schur decomposition must be a diagonal matrix and the column vectors of *Q* are the eigenvectors of *A*. Therefore, the Schur decomposition extends the spectral decomposition. In particular, if *A* is positive definite, the Schur decomposition of *A*, its spectral decomposition, and its singular value decomposition coincide.
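For instance, for a real symmetric (hence normal) matrix, the computed Schur form is diagonal and the Schur factors reproduce the spectral decomposition. A small SciPy illustration (not from the article):

```python
# For a symmetric (hence normal) matrix, the Schur form U is diagonal
# and the columns of Q are eigenvectors: B Q = Q U.
import numpy as np
from scipy.linalg import schur

B = np.array([[2., 1.], [1., 3.]])  # symmetric, hence normal
U, Q = schur(B, output="complex")

assert np.allclose(U, np.diag(np.diag(U)))  # U is (numerically) diagonal
assert np.allclose(B @ Q, Q @ U)            # columns of Q are eigenvectors
```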

A commuting family {*A _{i}*} of matrices can be simultaneously triangularized, i.e. there exists a unitary matrix *Q* such that, for every *A _{i}* in the given family, *QA _{i}Q** is upper triangular.

In the infinite dimensional setting, not every bounded operator on a Banach space has an invariant subspace. However, the upper-triangularization of an arbitrary square matrix does generalize to compact operators. Every compact operator on a complex Banach space has a nest of closed invariant subspaces.

The Schur decomposition of a given matrix is numerically computed by the QR algorithm or its variants. In other words, the roots of the characteristic polynomial of the matrix need not be computed beforehand in order to obtain its Schur decomposition. Conversely, the QR algorithm can be used to compute the roots of any given characteristic polynomial by finding the Schur decomposition of its companion matrix. Similarly, the QR algorithm is used to compute the eigenvalues of any given matrix, which are the diagonal entries of the upper triangular matrix of the Schur decomposition. Although the QR algorithm is formally an infinite sequence of operations, convergence to machine precision is in practice achieved in *O*(*n*^{3}) operations.^{[6]}
See the Nonsymmetric Eigenproblems section in LAPACK Users' Guide.^{[7]}
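The companion-matrix route can be sketched with SciPy (an illustration, not from the article): the roots of a polynomial appear on the diagonal of the Schur form of its companion matrix.

```python
# Roots of p(x) = x^2 - 3x + 2 = (x - 1)(x - 2) via the Schur form
# of its companion matrix.
import numpy as np
from scipy.linalg import schur, companion

C = companion([1., -3., 2.])         # companion matrix of x^2 - 3x + 2
U, Q = schur(C, output="complex")
roots = np.sort_complex(np.diag(U))  # eigenvalues of C = roots of p

assert np.allclose(roots, [1., 2.])
```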

Lie theory applications include:

- Every invertible operator is contained in a Borel group.
- Every operator fixes a point of the flag manifold.

Given square matrices *A* and *B*, the **generalized Schur decomposition** factorizes both matrices as *A* = *QSZ** and *B* = *QTZ**, where *Q* and *Z* are unitary, and *S* and *T* are upper triangular. The generalized Schur decomposition is also sometimes called the **QZ decomposition**.^{[2]}^{: 375 }^{[8]}

The generalized eigenvalues *λ* that solve the generalized eigenvalue problem *A***x** = *λB***x** (where **x** is an unknown nonzero vector) can be calculated as the ratio of the diagonal elements of *S* to those of *T*. That is, using subscripts to denote matrix elements, the *i*th generalized eigenvalue *λ _{i}* satisfies *λ _{i}* = *S _{ii}* / *T _{ii}*.
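A minimal QZ sketch using SciPy's `scipy.linalg.qz` (illustrative, not from the article), checking the factorization and recovering the generalized eigenvalues from the diagonal ratios:

```python
# Generalized Schur (QZ) decomposition: A = Q S Z*, B = Q T Z*,
# with generalized eigenvalues lambda_i = S_ii / T_ii.
import numpy as np
from scipy.linalg import qz, eig

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[2., 0.], [0., 1.]])
S, T, Q, Z = qz(A, B, output="complex")

assert np.allclose(A, Q @ S @ Z.conj().T)
assert np.allclose(B, Q @ T @ Z.conj().T)

gen_eigs = np.diag(S) / np.diag(T)  # lambda_i = S_ii / T_ii
ref = eig(A, B)[0]                  # generalized eigenvalues for comparison
assert np.allclose(np.sort_complex(gen_eigs), np.sort_complex(ref))
```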

1. Horn, R. A. & Johnson, C. R. (1985). *Matrix Analysis*. Cambridge University Press. ISBN 0-521-38632-2. (Section 2.3 and further at p. 79)
2. Golub, G. H. & Van Loan, C. F. (1996). *Matrix Computations* (3rd ed.). Johns Hopkins University Press. ISBN 0-8018-5414-8. (Section 7.7 at p. 313)
3. Schott, James R. (2016). *Matrix Analysis for Statistics* (3rd ed.). New York: John Wiley & Sons. pp. 175–178. ISBN 978-1-119-09247-6.
4. Wagner, David. "Proof of Schur's Theorem" (PDF). *Notes on Linear Algebra*.
5. Higham, Nick. "What Is a Schur Decomposition?".
6. Trefethen, Lloyd N.; Bau, David (1997). *Numerical Linear Algebra*. Philadelphia: Society for Industrial and Applied Mathematics. pp. 193–194. ISBN 0-89871-361-7. OCLC 36084666.
7. Anderson, E.; Bai, Z.; Bischof, C.; Blackford, S.; Demmel, J.; Dongarra, J.; Du Croz, J.; Greenbaum, A.; Hammarling, S.; McKenney, A.; Sorensen, D. (1995). *LAPACK Users' Guide*. Philadelphia, PA: Society for Industrial and Applied Mathematics. ISBN 0-89871-447-8.
8. Kressner, Daniel (2005). *Numerical Methods for General and Structured Eigenvalue Problems*. LNCSE-46. Springer. (Chapter 2)