Spectral theorem

In mathematics, particularly linear algebra and functional analysis, the spectral theorem is a collection of results about linear operators or about matrices. In broad terms the spectral theorem provides conditions under which an operator or a matrix can be diagonalized (that is, represented as a diagonal matrix in some basis). This concept of diagonalization is relatively straightforward for operators on finite-dimensional spaces, but requires some modification for operators on infinite-dimensional spaces. In general, the spectral theorem identifies a class of linear operators that can be modelled by multiplication operators, which are as simple as one can hope to find. See also spectral theory for a historical perspective.

Examples of operators to which the spectral theorem applies are self-adjoint operators or more generally normal operators on Hilbert spaces.

The spectral theorem also provides a canonical decomposition, called the spectral decomposition of the underlying vector space on which it acts.

In this article we consider mainly the simplest kind of spectral theorem, that for a self-adjoint operator on a Hilbert space. However, as noted above, the spectral theorem also holds for normal operators on a Hilbert space.

Finite-dimensional case

We begin by considering a symmetric operator A on a finite-dimensional real or complex inner product space V with the standard Hermitian inner product; the symmetry condition means

\langle A x \mid y \rangle = \langle x \mid A y \rangle

for all x,y elements of V. Recall that an eigenvector of a linear operator A is a vector x such that A x = r x for some scalar r. The value r is the corresponding eigenvalue.

Theorem. There is an orthonormal basis of V consisting of eigenvectors of A. Each eigenvalue is real.

This result is of such importance in many parts of mathematics that we provide a sketch of a proof in the case where the underlying field of scalars is the complex numbers. First we show that all the eigenvalues are real. Indeed, if λ is an eigenvalue of A with corresponding eigenvector x, then

\overline{\lambda} \langle x \mid x \rangle= \langle A x \mid x \rangle = \langle x \mid A x \rangle = \lambda \langle x \mid x \rangle .

It follows that λ equals its own conjugate and is therefore real.
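The realness of the eigenvalues is easy to check numerically. The following sketch (using NumPy, an assumption of this illustration rather than part of the article) builds a random Hermitian matrix and confirms that its computed eigenvalues have negligible imaginary part:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T                  # A is Hermitian: <Ax | y> = <x | Ay>

eigenvalues = np.linalg.eigvals(A)  # general solver, returns complex values

# Every eigenvalue equals its own conjugate, so the imaginary parts
# vanish up to rounding error.
print(np.max(np.abs(eigenvalues.imag)))
```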

To prove the existence of an eigenvector basis, we use induction on the dimension of V. In fact it suffices to show A has at least one non-zero eigenvector e. For then we can consider the space K of vectors v orthogonal to e. This is finite-dimensional, and A has the property that it maps every vector w in K into K:

\langle A w \mid e \rangle = \langle w \mid A e \rangle = \lambda \langle w \mid e \rangle = 0.

Moreover, A restricted to K is again a symmetric operator, on a space of strictly smaller dimension, so the induction hypothesis provides an orthonormal basis of K consisting of eigenvectors of A; together with e, this completes the proof.

It remains, however, to show that A has at least one eigenvector. Since the ground field is algebraically closed, the polynomial function p(x) = det(A − x I) has a root r. This implies the linear operator A − r I is not invertible and hence maps some non-zero vector e to 0. This vector e is a non-zero eigenvector of A. This completes the proof.

The spectral theorem is also true for symmetric operators on finite-dimensional real inner product spaces.

The spectral decomposition of an operator A which has an orthonormal basis of eigenvectors is obtained by grouping together all eigenvectors corresponding to the same eigenvalue. Thus, for each eigenvalue λ,

V_\lambda = \{\,v \in V: A v = \lambda v\,\}.

Note: these spaces are invariantly defined; their definition does not require any choice of specific eigenvectors.

As an immediate consequence of the spectral theorem for symmetric operators we get the spectral decomposition theorem: V is the orthogonal direct sum of the spaces Vλ, where the index ranges over the eigenvalues. Another equivalent formulation: letting Pλ be the orthogonal projection onto Vλ, we have

P_\lambda P_\mu=0 \quad \mbox{if } \lambda \neq \mu

and if λ1,..., λm are the eigenvalues of A,

A =\lambda_1 P_{\lambda_1} +\cdots+\lambda_m P_{\lambda_m}.
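The decomposition into projections can be illustrated numerically. The sketch below (NumPy assumed; the matrix is an arbitrary example with a repeated eigenvalue) groups the eigenvectors of a symmetric matrix by eigenvalue, forms the orthogonal projections Pλ, and verifies both the orthogonality relation and the reconstruction of A:

```python
import numpy as np

# A small real symmetric matrix with eigenvalues 2 (multiplicity 2) and 4.
A = np.array([[3.0, 0.0, 1.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 3.0]])

eigvals, Q = np.linalg.eigh(A)       # orthonormal eigenvector columns

# Group eigenvectors by (numerically rounded) eigenvalue and form the
# orthogonal projection P_lambda onto each eigenspace V_lambda.
projections = {}
for lam in np.unique(np.round(eigvals, 8)):
    cols = Q[:, np.isclose(eigvals, lam)]
    projections[lam] = cols @ cols.T

# A = sum over lambda of lambda * P_lambda,
# and P_lambda P_mu = 0 for lambda != mu.
A_rebuilt = sum(lam * P for lam, P in projections.items())
print(np.allclose(A, A_rebuilt))     # True
```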

If A is a normal operator on a finite-dimensional inner product space, A also has a spectral decomposition and the decomposition theorem holds for A. The eigenvalues will be complex numbers in general. The proof is somewhat more complicated and is discussed in the Axler reference below.

These results translate immediately into results about matrices: For any normal matrix A, there exists a unitary matrix U such that

A=U \Sigma U^* \;

where Σ is the diagonal matrix whose entries are the eigenvalues of A. Furthermore, any matrix which can be unitarily diagonalized in this way must be normal.

The column vectors of U are the eigenvectors of A and they are orthonormal.
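This can be checked on a small example. For a normal matrix with distinct eigenvalues, the unit eigenvectors returned by a general eigensolver are automatically orthogonal, so the eigenvector matrix is unitary. A sketch, assuming NumPy; the rotation matrix is an illustrative choice:

```python
import numpy as np

# A real rotation matrix: normal (A A* = A* A) but not symmetric;
# its eigenvalues are the complex numbers +i and -i.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

w, U = np.linalg.eig(A)
Sigma = np.diag(w)

# Because A is normal with distinct eigenvalues, the unit eigenvectors
# returned by eig are mutually orthogonal, so U is unitary.
print(np.allclose(U @ U.conj().T, np.eye(2)))     # True
print(np.allclose(A, U @ Sigma @ U.conj().T))     # True
```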

The spectral decomposition is a special case of the Schur decomposition. It is also a special case of the singular value decomposition.

If A is a real symmetric matrix, it follows by the real version of the spectral theorem for symmetric operators that there is an orthogonal matrix U such that U A U^T is diagonal, and all the eigenvalues of A are real.

The spectral theorem for compact self-adjoint operators

In Hilbert spaces in general, the statement of the spectral theorem for compact self-adjoint operators is virtually the same as in the finite-dimensional case.

Theorem. Suppose A is a compact self-adjoint operator on a Hilbert space V. There is an orthonormal basis of V consisting of eigenvectors of A. Each eigenvalue is real.

Again the key point is to prove the existence of at least one nonzero eigenvector. To prove this, we cannot rely on determinants to show existence of eigenvalues, but instead we use a maximization argument analogous to proving the min-max theorem for eigenvalues.
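The maximization idea behind this existence proof has a simple numerical analogue: power iteration drives a vector toward an eigenvector whose eigenvalue is largest in absolute value, the same extremal eigenvalue singled out by maximizing |⟨Ax, x⟩| over unit vectors. A sketch for a symmetric matrix (NumPy assumed; the function name and matrix are illustrative):

```python
import numpy as np

def dominant_eigenpair(A, iterations=200):
    """Power iteration: repeatedly apply A and renormalize.

    For a symmetric A this converges (for a generic starting vector)
    to an eigenvector whose eigenvalue has the largest absolute value.
    """
    rng = np.random.default_rng(1)
    x = rng.standard_normal(A.shape[0])
    for _ in range(iterations):
        x = A @ x
        x = x / np.linalg.norm(x)
    return x @ A @ x, x              # Rayleigh quotient and eigenvector

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # eigenvalues 1 and 3
lam, x = dominant_eigenpair(A)
print(np.allclose(A @ x, lam * x))   # True: x is an eigenvector
```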

Note that the above spectral theorem holds for real or complex Hilbert spaces.

Functional analysis

The next generalization we consider is that of bounded self-adjoint operators A on a Hilbert space V. Such operators may have no eigenvalues: for instance, let A be the operator of multiplication by t on L2[0, 1], that is

[A \varphi](t) = t \varphi(t). \;

Indeed, if t \varphi(t) = \lambda \varphi(t) for almost every t, then \varphi vanishes almost everywhere, so A has no eigenvectors in L2[0, 1].

Theorem. Let A be a bounded self-adjoint operator on a Hilbert space H. Then there is a measure space (X, M, μ), a real-valued measurable function f on X, and a unitary operator U : H → L2μ(X) such that

U^* T U = A \;

where T is the multiplication operator:

[T \varphi](x) = f(x) \varphi(x). \;
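In finite dimensions the theorem reduces to diagonalization: take X = {1, ..., n} with counting measure, let f be the list of eigenvalues, and let T be multiplication by f, i.e. a diagonal matrix. A sketch (NumPy assumed; note that with `eigh`'s convention A = U T U*, so the U of the theorem corresponds to the adjoint of NumPy's eigenvector matrix):

```python
import numpy as np

# Finite-dimensional instance: X = {0, ..., n-1} with counting measure,
# f = the eigenvalues of A, and T = multiplication by f (a diagonal matrix).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])      # real symmetric, hence self-adjoint

f, U = np.linalg.eigh(A)             # f real, columns of U orthonormal
T = np.diag(f)                       # [T phi](x) = f(x) phi(x)

# A is unitarily equivalent to the multiplication operator T.
print(np.allclose(U @ T @ U.conj().T, A))   # True
```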

This is the beginning of the vast research area of functional analysis called operator theory.

A normal operator on a Hilbert space may have no eigenvalues; for example, the bilateral shift on the Hilbert space l2(Z) has no eigenvalues. There is also a spectral theorem for normal operators on Hilbert spaces, though, in which the sum in the finite-dimensional spectral theorem is replaced by an integral of the coordinate function over the spectrum against a projection-valued measure.

When the normal operator in question is compact, this spectral theorem reduces to the finite-dimensional spectral theorem above, except that the operator is expressed as a linear combination of possibly infinitely many projections.

The spectral theorem for general self-adjoint operators

Many important linear operators which occur in analysis, such as differential operators, are unbounded. There is, however, a spectral theorem for self-adjoint operators which applies in many of these cases. To give an example, any constant-coefficient differential operator is unitarily equivalent to a multiplication operator. Indeed, the unitary operator which implements this equivalence is the Fourier transform.

Reference

  • Sheldon Axler, Linear Algebra Done Right, Springer Verlag, 1997
The contents of this article are licensed from Wikipedia.org under the GNU Free Documentation License.