Fourier series

In mathematics, a Fourier series, named in honor of Joseph Fourier (1768-1830), is a representation of a periodic function (often taken to have period 2π -- in a sense, the simplest case) as a sum of periodic functions of the form

x\mapsto e^{inx},

which are harmonics of e^{ix}. By Euler's formula, the series can be expressed equivalently in terms of sine and cosine functions.

Fourier was the first to study systematically such infinite series, after preliminary investigations by Euler, d'Alembert, and Daniel Bernoulli. He applied these series to the solution of the heat equation, publishing his initial results in 1807 and 1811, and publishing his Théorie analytique de la chaleur in 1822. From a modern point of view, Fourier's results are somewhat informal, due in no small part to the lack of a precise notion of function and integral in the early nineteenth century. Later, Dirichlet and Riemann expressed Fourier's results with greater precision and formality.

Many other Fourier-related transforms have since been defined, extending to other applications the initial idea of representing any periodic function as a superposition of harmonics. This general area of inquiry is now sometimes called harmonic analysis.

Definition of Fourier series

Suppose f(x) is a complex-valued function of a real variable that is periodic with period 2π and square-integrable over one period, for example the interval [-π, π]. Let

F_n =\frac{1}{2\pi}\int_{-\pi}^\pi f(x)\,e^{-inx}\,dx.

Then the Fourier series representation of f(x) is given by

f(x) = \sum_{n=-\infty}^{\infty} F_n \,e^{inx}.

Each term in this sum is called a Fourier mode. In the important special case of a real-valued function f(x), one often uses the identity

e^{inx}=\cos(nx)+i\sin(nx) \,\!

to equivalently represent f(x) as an infinite linear combination of functions of the form \cos(nx) \,\! and \sin(nx) \,\!, i.e.

f(x) = \frac{1}{2}a_0 + \sum_{n=1}^\infty\left[a_n\cos(nx)+b_n\sin(nx)\right], where
a_n = \frac{1}{\pi}\int_{-\pi}^\pi f(x)\cos(nx)\,dx and b_n = \frac{1}{\pi}\int_{-\pi}^\pi f(x)\sin(nx)\,dx

which corresponds to F_n = (a_n - i b_n) / 2 \,\! and F_n = F_{-n}^*.
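
As an illustrative numerical sketch (the test function f and the grid size M below are arbitrary choices, not taken from the article), the coefficients defined above can be approximated by quadrature and the correspondence between the complex and real forms checked directly:

import numpy as np

# Approximate Fourier coefficients of a sample real, 2*pi-periodic function
# and check F_n = (a_n - i*b_n)/2 and F_{-n} = F_n^* (conjugate symmetry).

def f(x):
    return np.exp(np.cos(x)) * np.sin(2 * x)   # arbitrary real-valued example

M = 4096                                       # quadrature points per period
x = -np.pi + 2 * np.pi * np.arange(M) / M      # uniform grid on [-pi, pi)
dx = 2 * np.pi / M

def F(n):   # F_n = (1/2pi) * integral of f(x) e^{-inx} dx
    return np.sum(f(x) * np.exp(-1j * n * x)) * dx / (2 * np.pi)

def a(n):   # a_n = (1/pi) * integral of f(x) cos(nx) dx
    return np.sum(f(x) * np.cos(n * x)) * dx / np.pi

def b(n):   # b_n = (1/pi) * integral of f(x) sin(nx) dx
    return np.sum(f(x) * np.sin(n * x)) * dx / np.pi

for n in range(1, 6):
    print(n, abs(F(n) - (a(n) - 1j * b(n)) / 2))   # ~0 up to quadrature error
    print(n, abs(F(-n) - np.conj(F(n))))           # ~0 since f is real-valued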

For example problems, see http://www.exampleproblems.com/wiki/index.php?title=Fourier_Series .

Convergence of Fourier series

While the Fourier coefficients a_n and b_n can be formally defined for any function for which the integrals make sense, whether the series so defined actually converges to f(x) depends on the properties of f.

The simplest answer is that if f is square-integrable then

\lim_{N\rightarrow\infty}\int_{-\pi}^\pi\left|f(x)-\sum_{n=-N}^{N} F_n\,e^{inx}\right|^2\,dx=0

(this is convergence in the norm of the space L2).
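
For instance (a minimal numerical sketch; the square-wave test function and grid are assumptions made for illustration), the squared L2 error of the symmetric partial sums of sign(x) can be watched shrinking as N grows, even though sign(x) is discontinuous:

import numpy as np

# L^2 convergence: the integral of |f - S_N|^2 over one period tends to 0.

M = 8192
x = -np.pi + 2 * np.pi * np.arange(M) / M
dx = 2 * np.pi / M
fx = np.sign(x)                       # square wave on (-pi, pi)

def partial_sum(N):
    S = np.zeros_like(x, dtype=complex)
    for n in range(-N, N + 1):
        Fn = np.sum(fx * np.exp(-1j * n * x)) * dx / (2 * np.pi)
        S += Fn * np.exp(1j * n * x)
    return S.real                     # imaginary part is ~0 for real f

for N in (1, 4, 16, 64, 256):
    err = np.sum((fx - partial_sum(N)) ** 2) * dx   # integral of |f - S_N|^2
    print(N, err)                     # decreases toward 0 as N grows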

There are also many known tests that ensure that the series converges at a given point x; for example, it does so if the function is differentiable at x. Even a jump discontinuity does not pose a problem: if the function has left and right derivatives at x, then the Fourier series will converge to the average of the left and right limits (but see Gibbs phenomenon).
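
The square wave sign(x) illustrates both points (a sketch with assumed parameters): its partial sums take the value 0 at the jump, the average of the one-sided limits -1 and +1, while their peak just beside the jump overshoots to roughly 1.18 no matter how large N is taken; this persistent overshoot is the Gibbs phenomenon.

import numpy as np

# Partial sums of sign(x) in the real form: a_n = 0, b_n = 4/(pi*n) for odd n.

def S_N(xs, N):
    total = np.zeros_like(xs)
    for n in range(1, N + 1, 2):
        total += (4.0 / (np.pi * n)) * np.sin(n * xs)
    return total

xs = np.linspace(-0.5, 0.5, 20001)
for N in (9, 99, 999):
    vals = S_N(xs, N)
    print(N, S_N(np.array([0.0]), N)[0], vals.max())
    # the value at the jump stays at 0, the average of -1 and +1;
    # the peak near the jump stays near 1.18 rather than approaching 1
    # (the Gibbs phenomenon)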

However, a fact that many find surprising is that the Fourier series of a continuous function need not converge pointwise. A discussion of the counterexample, along with other positive and negative results in the general spirit of "for functions of type X, the Fourier series converges in sense Y", may be found in Convergence of Fourier series.

Some positive consequences of the homomorphism properties of exp

Because the "basis functions" e^{ikx} are homomorphisms of the real line (more precisely, of the "circle group"), we have some useful identities (a numerical spot-check follows the list):

  • If
g(x)=f(x-y) \,\!

then (if G is the transform of g)

G_k = e^{-iky}F_k \,\!.
  • If H_k is the transform of h = f * g \,\!, then
H_k = F_k G_k \,\!,

that is, the Fourier transform of a convolution is the product of the Fourier transforms. Conversely, if h = fg, then the Fourier transform H of h is the convolution of the Fourier transforms of f and g:

H_k=\sum_{i=-\infty}^\infty F_i G_{k-i}.
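
These identities can be spot-checked numerically. The sketch below (an illustration with arbitrarily chosen trigonometric polynomials f and g and an arbitrary shift y, so that every coefficient sequence is finitely supported) verifies the shift rule and the product-to-convolution rule; the convolution-to-product direction depends on how the convolution integral is normalized and is omitted here.

import numpy as np

M = 4096
x = -np.pi + 2 * np.pi * np.arange(M) / M
dx = 2 * np.pi / M

def coeff(values, n):   # F_n = (1/2pi) * integral of f(x) e^{-inx} dx
    return np.sum(values * np.exp(-1j * n * x)) * dx / (2 * np.pi)

f = lambda t: np.cos(t)                             # coefficients vanish for |n| > 1
g = lambda t: np.sin(2 * t) + 0.5 * np.cos(3 * t)   # coefficients vanish for |n| > 3
y = 0.7                                             # an arbitrary shift

# 1) shift rule: the transform of f(x - y) equals e^{-iky} times that of f
for k in (-1, 0, 1):
    Gk = coeff(f(x - y), k)
    print(k, abs(Gk - np.exp(-1j * k * y) * coeff(f(x), k)))   # ~0

# 2) pointwise product: the coefficients of f(x)g(x) are the convolution
#    of the two coefficient sequences, H_k = sum_i F_i G_{k-i}
for k in range(-4, 5):
    Hk = coeff(f(x) * g(x), k)
    conv = sum(coeff(f(x), i) * coeff(g(x), k - i) for i in range(-1, 2))
    print(k, abs(Hk - conv))                                   # ~0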

Parseval's theorem

Another important property of the Fourier series is Parseval's theorem, a special case of the Plancherel theorem and a form of unitarity:

\sum_{n=-\infty}^\infty |F_n|^2 = \frac{1}{2\pi} \int_{-\pi}^\pi |f(x)|^2 dx \,

or, for the real-valued f(x) case above,

\frac{a_0^2}{4} + \frac{1}{2} \sum_{n=1}^\infty \left( a_n^2 + b_n^2 \right) = \frac{1}{2\pi} \int_{-\pi}^\pi f(x)^2 dx.
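
As a quick numerical illustration (a sketch; the test function is an arbitrary choice), take f(x) = x on (-π, π): its coefficients are a_n = 0 and b_n = 2(-1)^{n+1}/n, and both sides of the real-valued form above equal π²/3.

import numpy as np

# Parseval check for f(x) = x on (-pi, pi): b_n = 2*(-1)^(n+1)/n, a_n = 0.

N = 100000
n = np.arange(1, N + 1)
b = 2.0 * (-1.0) ** (n + 1) / n
lhs = 0.5 * np.sum(b ** 2)        # a_0^2/4 + (1/2) * sum of (a_n^2 + b_n^2)

rhs = np.pi ** 2 / 3              # (1/2pi) * integral of x^2 over (-pi, pi)
print(lhs, rhs)                   # lhs approaches rhs (slowly, since b_n ~ 1/n)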

General formulation

The useful properties of Fourier series are largely derived from the orthogonality and homomorphism properties of the functions e^{inx} \,\!. Other sequences of orthogonal functions have similar properties, although some useful identities, for example those concerning convolutions, are no longer true once the homomorphism property is lost. Examples include sequences of Bessel functions and orthogonal polynomials. Such sequences are commonly the solutions of a differential equation; a large class of useful sequences arises as solutions of so-called Sturm-Liouville problems.
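
As a brief sketch of the same idea with a different orthogonal family (an illustration with an arbitrarily chosen target function), the function |x| can be expanded on [-1, 1] in Legendre polynomials P_n; their orthogonality gives the coefficients c_n = (2n+1)/2 times the integral of f(x) P_n(x) over [-1, 1].

import numpy as np
from numpy.polynomial import legendre as L

# Expand f(x) = |x| on [-1, 1] in Legendre polynomials using the projection
# formula c_n = (2n+1)/2 * integral of f(x) P_n(x) dx, evaluated by
# Gauss-Legendre quadrature.

f = lambda t: np.abs(t)
nodes, weights = L.leggauss(200)            # quadrature nodes and weights on [-1, 1]

N = 20
coeffs = np.zeros(N + 1)
for n in range(N + 1):
    e_n = np.zeros(n + 1); e_n[n] = 1.0     # coefficient vector selecting P_n
    Pn = L.legval(nodes, e_n)
    coeffs[n] = (2 * n + 1) / 2.0 * np.sum(weights * f(nodes) * Pn)

xs = np.linspace(-1, 1, 5)
print(L.legval(xs, coeffs))   # truncated expansion, close to |x| at these points
print(f(xs))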




