In mathematics, a Fourier series, named in honor of Joseph Fourier (1768-1830), is a representation of a periodic function (often taken to have period 2π -- in a sense, the simplest case) as a sum of periodic functions of the form e^{inx}, where n ranges over the integers.
Fourier was the first to study such infinite series systematically, after preliminary investigations by Euler, d'Alembert, and Daniel Bernoulli. He applied these series to the solution of the heat equation, presenting his initial results in 1807 and 1811 and publishing his Théorie analytique de la chaleur in 1822. From a modern point of view, Fourier's results are somewhat informal, due in no small part to the lack of a precise notion of function and integral in the early nineteenth century. Later, Dirichlet and Riemann expressed Fourier's results with greater precision and formality.
Many other Fourier-related transforms have since been defined, extending to other applications the initial idea of representing any periodic function as a superposition of harmonics. This general area of inquiry is now sometimes called harmonic analysis.
Definition of Fourier series
Suppose f(x) is a complex-valued function of a real variable that is periodic with period 2π and square-integrable over the interval from 0 to 2π. Let

    F_n = \frac{1}{2\pi} \int_0^{2\pi} f(x)\, e^{-inx}\, dx.

Then the Fourier series representation of f(x) is given by

    f(x) = \sum_{n=-\infty}^{\infty} F_n\, e^{inx}.
Each term in this sum is called a Fourier mode. In the important special case of a real-valued function f(x), one often uses the identity

    e^{inx} = \cos(nx) + i \sin(nx)

to equivalently represent f(x) as an infinite linear combination of functions of the form cos(nx) and sin(nx), i.e.

- f(x) = \frac{a_0}{2} + \sum_{n=1}^{\infty} \left[ a_n \cos(nx) + b_n \sin(nx) \right], where

    a_n = \frac{1}{\pi} \int_0^{2\pi} f(x) \cos(nx)\, dx, \qquad b_n = \frac{1}{\pi} \int_0^{2\pi} f(x) \sin(nx)\, dx,

which corresponds to a_n = F_n + F_{-n} and b_n = i\,(F_n - F_{-n}).
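As an illustration, the real coefficients a_n and b_n can be approximated numerically for a concrete f. The sketch below (plain Python; the helper names are my own, and a midpoint Riemann sum stands in for the exact integrals) computes them for a square wave, whose well-known series is (4/π) Σ_{odd n} sin(nx)/n.

```python
import math

def fourier_coeffs(f, n, N=20000):
    """Approximate a_n = (1/pi) * integral of f(x) cos(nx) dx over [0, 2*pi]
    (and b_n, with sin in place of cos) by a midpoint Riemann sum."""
    h = 2 * math.pi / N
    xs = [(k + 0.5) * h for k in range(N)]   # midpoints avoid the jump points
    a = sum(f(x) * math.cos(n * x) for x in xs) * h / math.pi
    b = sum(f(x) * math.sin(n * x) for x in xs) * h / math.pi
    return a, b

# Square wave: +1 on (0, pi), -1 on (pi, 2*pi)
def square(x):
    return 1.0 if (x % (2 * math.pi)) < math.pi else -1.0

a1, b1 = fourier_coeffs(square, 1)
a2, b2 = fourier_coeffs(square, 2)
```

The computed b_1 should agree with 4/π ≈ 1.273 to several decimal places, while a_1, a_2, and b_2 should be numerically zero, matching the known series for the square wave.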
Convergence of Fourier series
While the Fourier coefficients a_n and b_n can be formally defined for any function for which the integrals make sense, whether the series so defined actually converges to f(x) depends on the properties of f.
The simplest answer is that if f is square-integrable then

    \lim_{N \to \infty} \int_0^{2\pi} \left| f(x) - \sum_{n=-N}^{N} F_n\, e^{inx} \right|^2 dx = 0

(this is convergence in the norm of the space L^2).
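This norm convergence can be watched happening for a concrete example. For the square wave equal to +1 on (0, π) and -1 on (π, 2π), the coefficients are b_n = 4/(πn) for odd n (all others vanish), so by Parseval's identity the squared L^2 error of the N-th partial sum is exactly the tail energy. A minimal sketch in plain Python (the helper name is mine):

```python
import math

def l2_error_sq(N):
    """Squared L2 distance between the square wave (+1 on (0,pi), -1 on
    (pi,2*pi)) and its N-th partial Fourier sum.  By Parseval's identity
    this is the tail energy pi * sum of b_n**2 over odd n > N, computed
    here as total energy minus the captured part."""
    total = 2 * math.pi   # integral of |f(x)|^2 over [0, 2*pi]
    captured = math.pi * sum((4 / (math.pi * n)) ** 2
                             for n in range(1, N + 1, 2))
    return total - captured

errors = [l2_error_sq(N) for N in (1, 9, 99, 999)]
```

The computed errors decrease monotonically toward 0 as N grows, which is exactly the L^2 convergence the theorem promises.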
There are also many known tests that ensure that the series converges at a given point x: for example, it converges to f(x) if f is differentiable at x. Even a jump discontinuity does not pose a problem: if the function has left and right derivatives at x, then the Fourier series converges to the average of the left and right limits (but see Gibbs phenomenon).
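Both behaviors at a jump can be checked numerically for the square wave (+1 on (0, π), -1 on (π, 2π)), whose partial sums are S_N(x) = (4/π) Σ_{odd n ≤ N} sin(nx)/n. A small sketch in plain Python (the function name is my own):

```python
import math

def S(N, x):
    """N-th partial Fourier sum of the square wave that is +1 on (0, pi)
    and -1 on (pi, 2*pi): (4/pi) * sum of sin(n*x)/n over odd n <= N."""
    return (4 / math.pi) * sum(math.sin(n * x) / n for n in range(1, N + 1, 2))

# At the jump x = 0 (one-sided limits -1 and +1) every partial sum
# lands exactly on their average, 0:
val_at_jump = S(101, 0.0)

# Gibbs phenomenon: just to the right of the jump, the partial sums
# overshoot the value +1 by roughly 9% of the jump size, peaking near
# 1.18 rather than converging uniformly to 1:
peak = max(S(101, k * 1e-4) for k in range(1, 2001))
```

The overshoot does not shrink as N grows; its location moves toward the jump while its height approaches a fixed value of about 1.179.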
However, a fact that many find surprising is that the Fourier series of a continuous function need not converge pointwise. A discussion of the counterexample, along with other positive and negative results in the general spirit of "for functions of type X, the Fourier series converges in sense Y", may be found in Convergence of Fourier series.
Some positive consequences of the homomorphism properties of exp
Because the "basis functions" e^{ikx} are homomorphisms of the real line (more precisely, of the "circle group"), we have some useful identities:

- If g(x) = f(x - y), then (if G is the transform of g)

    G_k = e^{-iky} F_k.

- If H_k is the transform of h = f * g, where (f * g)(x) = \frac{1}{2\pi} \int_0^{2\pi} f(y)\, g(x - y)\, dy, then

    H_k = F_k G_k,

that is, the Fourier transform of a convolution is the product of the Fourier transforms. Vice versa, if h = fg, then the Fourier transform H of h is the convolution of the Fourier transforms of f and g:

    H_k = \sum_{j=-\infty}^{\infty} F_j G_{k-j},

or, for the real-valued f(x) case above, the same identities can be rewritten in terms of a_n and b_n using F_n = (a_n - i b_n)/2 and F_{-n} = (a_n + i b_n)/2 for n ≥ 1.
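The product-to-convolution identity can be verified numerically for trigonometric polynomials, for which a rectangle-rule quadrature recovers the coefficients exactly (up to rounding). A minimal sketch in plain Python, with helper names of my own choosing:

```python
import cmath
import math

def coeff(func, k, N=256):
    """F_k = (1/(2*pi)) * integral of func(x) * e^{-ikx} dx over [0, 2*pi],
    by the rectangle rule, which is exact (up to floating-point rounding)
    for trigonometric polynomials of degree < N/2."""
    h = 2 * math.pi / N
    return sum(func(j * h) * cmath.exp(-1j * k * j * h)
               for j in range(N)) * h / (2 * math.pi)

f = lambda x: 2 * math.cos(x)      # F_1 = F_{-1} = 1, all other F_k = 0
g = lambda x: 2 * math.cos(x)
h_prod = lambda x: f(x) * g(x)     # h = fg = 4 cos^2 x = 2 + 2 cos(2x)

F = {k: coeff(f, k) for k in range(-3, 4)}
G = {k: coeff(g, k) for k in range(-3, 4)}

# Coefficients of the product h, computed directly versus by convolving
# the coefficient sequences of f and g:
H2_direct = coeff(h_prod, 2)
H2_conv = sum(F[j] * G.get(2 - j, 0) for j in F)   # expect F_1 * G_1 = 1
H0_direct = coeff(h_prod, 0)
H0_conv = sum(F[j] * G.get(-j, 0) for j in F)      # expect F_1*G_{-1} + F_{-1}*G_1 = 2
```

Both routes give the same numbers: H_0 = 2 and H_2 = 1, matching the expansion h = 2 + 2 cos(2x).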
The useful properties of Fourier series are largely derived from the orthogonality and homomorphism properties of the functions e^{inx}. Other sequences of orthogonal functions have similar properties, although some useful identities, concerning e.g. convolutions, are no longer true once the homomorphism property is lost. Examples include sequences of Bessel functions and orthogonal polynomials. Such sequences are commonly the solutions of a differential equation; a large class of useful sequences arises as solutions of so-called Sturm-Liouville problems.