Distribution

This page deals with mathematical distributions. For other meanings of distribution, see distribution (disambiguation). This article is not about probability distributions.

In mathematical analysis, distributions (also known as generalized functions) are objects which generalize functions and probability distributions. They extend the concept of derivative to all continuous functions (and beyond) and are used to formulate generalized solutions of partial differential equations. They are important in physics and engineering, where many non-continuous problems naturally lead to differential equations whose solutions are distributions, such as the Dirac delta distribution.

"Generalized functions" were introduced by Sergei Sobolev in 1935. They were independently discovered in late 1940s by Laurent Schwartz, who developed a comprehensive theory of distributions.

Sometimes people speak of a "probability distribution" when they really mean a "probability measure", especially one obtained by multiplying the Lebesgue measure by a nonnegative, real-valued measurable function whose integral equals 1 (a probability density).

Basic idea

The basic idea is as follows. If f : R → R is an integrable function, and φ : R → R is a smooth (that is, infinitely differentiable) function with compact support (that is, it is identically zero except on some bounded set), then ∫fφdx is a real number which linearly and continuously depends on φ. One can therefore think of the function f as a continuous linear functional on the space which consists of all the "test functions" φ. Similarly, if P is a probability distribution on the reals and φ is a test function, then ∫φdP is a real number that continuously and linearly depends on φ: probability distributions can thus also be viewed as continuous linear functionals on the space of test functions. This notion of "continuous linear functional on the space of test functions" is therefore used as the definition of a distribution.
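To make this pairing concrete, here is a small numerical sketch (not part of the original article; the function names are illustrative). It builds a standard smooth bump function as a test function and approximates ∫fφdx by a Riemann sum, so that f enters only through its action on test functions:

import numpy as np

def bump(x, a=1.0):
    # standard smooth test function: exp(-1/(1 - (x/a)^2)) on (-a, a), zero elsewhere
    y = np.zeros_like(x, dtype=float)
    inside = np.abs(x) < a
    y[inside] = np.exp(-1.0 / (1.0 - (x[inside] / a) ** 2))
    return y

def pair(f, phi, lo=-2.0, hi=2.0, n=200001):
    # approximate the pairing <f, phi> = integral of f*phi dx by a Riemann sum
    x = np.linspace(lo, hi, n)
    return np.sum(f(x) * phi(x)) * (x[1] - x[0])

print(pair(np.abs, bump))  # a real number depending linearly and continuously on the test function

Replacing f by 2f, or phi by a sum of two test functions, changes the output in the expected linear way.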

Such distributions may be multiplied by real numbers and added together, so they form a real vector space. In general it is not possible to define a product of two distributions, but distributions may be multiplied by infinitely often differentiable functions.

To define the derivative of a distribution, we first consider the case of a differentiable and integrable function f : R → R. If φ is a test function, then we have

\int_{\mathbf{R}} f'\phi \,dx = - \int_{\mathbf{R}} f\phi' \,dx

using integration by parts (note that φ is zero outside of a bounded set and that therefore no boundary values have to be taken into account). This suggests that if S is a distribution, we should define its derivative S' as the linear functional which sends the test function φ to -S(φ'). It turns out that this is the proper definition; it extends the ordinary definition of derivative, every distribution becomes infinitely often differentiable and the usual properties of derivatives hold.
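As a hedged numerical sketch of this definition (illustrative, not from the original article): for f(x) = |x|, which is not differentiable at 0, the functional φ ↦ -∫fφ' dx agrees with the pairing of the sign function with φ, so the distributional derivative of |x| is the sign function:

import numpy as np

x = np.linspace(-2.0, 2.0, 400001)
dx = x[1] - x[0]

# smooth bump test function supported in (-0.5, 1.5); off-center so the
# integrals are not forced to vanish by symmetry alone
phi = np.zeros_like(x)
m = np.abs(x - 0.5) < 1
phi[m] = np.exp(-1.0 / (1.0 - (x[m] - 0.5) ** 2))
dphi = np.gradient(phi, dx)

f = np.abs(x)                        # |x| has no classical derivative at 0 ...
lhs = -np.sum(f * dphi) * dx         # ... but -S(phi') is perfectly well defined
rhs = np.sum(np.sign(x) * phi) * dx  # and agrees with the action of sign(x) on phi
print(lhs, rhs)                      # the two values agree up to discretization error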

The Dirac delta (the so-called Dirac delta function) is the distribution which sends the test function φ to φ(0). It is the derivative of the Heaviside step function H, defined by H(x) = 0 if x < 0 and H(x) = 1 if x ≥ 0. The derivative of the Dirac delta is the distribution which sends the test function φ to -φ'(0). This latter distribution is our first example of a distribution which is neither a function nor a probability distribution.
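The claim that H' is the Dirac delta can be spelled out with the definition above: for every test function φ,

H'(\phi) = -H(\phi') = -\int_{\mathbf{R}} H(x)\,\phi'(x)\,dx = -\int_{0}^{\infty} \phi'(x)\,dx = \phi(0)

since φ vanishes outside a bounded set; this is exactly the action of the Dirac delta.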

Alternatively, a distribution can be obtained as the limit of a sequence of functions. For instance, the delta function is given by

\delta (x) = \lim_{a \to 0} \delta_a(x)

where δa(x) is 1/(2a) if x is between -a and a, and is 0 otherwise.
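A quick numerical check of this limit (an illustrative sketch, not from the original article): the pairing ∫δa(x)φ(x)dx approaches φ(0) as a → 0, here with a Gaussian used as a convenient smooth (though not compactly supported) stand-in for a test function:

import numpy as np

def phi(x):
    return np.exp(-x**2)   # phi(0) = 1

for a in [1.0, 0.1, 0.01, 0.001]:
    x = np.linspace(-a, a, 100001)
    dx = x[1] - x[0]
    integral = np.sum(phi(x) / (2 * a)) * dx
    print(a, integral)     # tends to phi(0) = 1 as a -> 0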

Formal definition

In the sequel, real-valued distributions on an open subset U of Rn will be formally defined. (With minor modifications, one can also define complex-valued distributions, and one can replace Rn by any smooth manifold.) First, the space D(U) of test functions on U needs to be explained. A function φ : U → R is said to have compact support if there exists a compact subset K of U such that φ(x) = 0 for all x in U \ K. The elements of D(U) are the infinitely often differentiable functions φ : U → R with compact support. This is a real vector space. We turn it into a topological vector space by requiring that a sequence (or net) (φk) converges to 0 if and only if there exists a compact subset K of U such that all φk are identically zero outside K, and for every ε > 0 and natural number d ≥ 0 there exists a natural number k0 such that for all k ≥ k0 the absolute value of all d-th derivatives of φk is smaller than ε. With this definition, D(U) becomes a complete topological vector space (in fact, a so-called LF-space).
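In symbols, the convergence condition just described reads: φk → 0 in D(U) if and only if all the φk are supported in a single compact set K ⊂ U and, for every multi-index α,

\sup_{x \in U} \left| \partial^{\alpha} \phi_k(x) \right| \to 0 \quad \text{as } k \to \infty

that is, every derivative of the φk converges uniformly to 0 on U.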

The dual space of the topological vector space D(U), consisting of all continuous linear functionals S : D(U) → R, is the space of all distributions on U; it is a vector space and is denoted by D'(U).

The function f : U → R is called locally integrable if it is Lebesgue integrable over every compact subset K of U. This is a large class of functions which includes all continuous functions. The topology on D(U) is defined in such a fashion that any locally integrable function f yields a continuous linear functional on D(U) whose value on the test function φ is given by the Lebesgue integral ∫U fφ dx. Two locally integrable functions f and g yield the same element of D'(U) if and only if they are equal almost everywhere. Similarly, every Radon measure μ on U (which includes the probability distributions) defines an element of D'(U) whose value on the test function φ is ∫φ dμ.

As mentioned above, integration by parts suggests that the derivative dS/dx of the distribution S in direction x should be defined using the formula

dS / dx (φ) = - S (dφ / dx)

for all test functions φ. In this way, every distribution is infinitely often differentiable, and the derivative in direction x is a linear operator on D'(U).

The space D'(U) is turned into a locally convex topological vector space by defining that the sequence (Sk) converges towards 0 if and only if Sk(φ) → 0 for all test functions φ; this topology is called the strong (operator) topology. This is the case if and only if Sk converges uniformly to 0 on all bounded subsets of D(U). (A subset E of D(U) is bounded if there exists a compact subset K of U and numbers dn such that every φ in E has its support in K and has its n-th derivatives bounded by dn.) With respect to this topology, differentiation of distributions is a continuous operator; this is an important and desirable property that is not shared by most other notions of differentiation. Furthermore, the test functions (which can themselves be viewed as distributions) are dense in D'(U) with respect to this topology.
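A classical illustration of this mode of convergence (not taken from the original article): the functions sin(kx) do not converge pointwise as k → ∞, yet the distributions they define converge to 0, because ∫sin(kx)φ(x)dx → 0 for every test function φ (integration by parts gives a bound of order 1/k). A rough numerical check:

import numpy as np

x = np.linspace(-2.0, 2.0, 400001)
dx = x[1] - x[0]

# smooth bump supported in (-0.5, 1.5); off-center so the integrals
# are not forced to vanish by symmetry alone
phi = np.zeros_like(x)
m = np.abs(x - 0.5) < 1
phi[m] = np.exp(-1.0 / (1.0 - (x[m] - 0.5) ** 2))

for k in [1, 10, 100, 1000]:
    print(k, np.sum(np.sin(k * x) * phi) * dx)   # tends to 0, so sin(kx) -> 0 in D'(R)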

If ψ : U → R is an infinitely often differentiable function and S is a distribution on U, we define the product Sψ by (Sψ)(φ) = S(ψφ) for all test functions φ. The ordinary product rule of calculus remains valid.
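For example, multiplying the Dirac delta by an infinitely often differentiable function ψ simply rescales it: for every test function φ,

(\delta\psi)(\phi) = \delta(\psi\phi) = \psi(0)\,\phi(0)

so δψ = ψ(0)δ.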

Compact support and convolution

We say that a distribution S has compact support if there is a compact subset K of U such that for every test function φ whose support is completely outside of K, we have S(φ) = 0. Alternatively, one may define distributions with compact support as continuous linear functionals on the space C∞(U) of infinitely often differentiable functions on U; the topology on C∞(U) is defined such that φk converges to 0 if and only if all derivatives of φk converge uniformly to 0 on every compact subset of U.

If both S and T are distributions on Rn and one of them has compact support, then one can define a new distribution, the convolution S*T of S and T, as follows: if φ is a test function in D(Rn) and x, y elements of Rn, write φx(y) = φ(x + y), ψ(x) = T(φx) and (S*T)(φ) = S(ψ). This generalizes the classical notion of convolution of functions and is compatible with differentiation in the following sense:

d/dx (S * T) = (d/dx S) * T = S * (d/dx T).
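For instance, taking S = δ in the definition above shows that the Dirac delta is an identity element for convolution: here ψ(x) = T(φx), and

(\delta * T)(\phi) = \delta(\psi) = \psi(0) = T(\phi_0) = T(\phi)

since φ0 = φ; hence δ*T = T for every distribution T.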

Tempered distributions and Fourier transform

By using a larger space of test functions, one can define the tempered distributions, a subspace of D'(Rn). These distributions are useful if one studies the Fourier transform in generality: all tempered distributions have a Fourier transform, but not all distributions have one.

The space of test functions employed here, the so-called Schwartz space, is the space of all infinitely differentiable, rapidly decreasing functions, where φ : Rn → R is called rapidly decreasing if every derivative of φ, multiplied by any power of |x|, converges to 0 as |x| → ∞. These functions form a complete topological vector space with a suitably defined family of seminorms. More precisely, let

p_{\alpha, \beta} (\phi) = \sup_{x \in \mathbb{R}^n} | x^\alpha D^\beta \phi(x)|

for α, β multi-indices of size n. Then φ is rapidly decreasing if

p_{\alpha, \beta} (\phi) < \infty

for all multi-indices α and β. The family of seminorms pα, β defines a locally convex topology on the Schwartz space; it is metrizable and complete.
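For a rough numerical illustration (a sketch, not from the original article): the one-dimensional Gaussian φ(x) = exp(-x²/2) is rapidly decreasing, since each of its derivatives is a polynomial times the Gaussian itself, so every seminorm pα,β(φ) is finite. The snippet below estimates a few of these seminorms on a grid, using finite differences for the derivatives:

import numpy as np

x = np.linspace(-20.0, 20.0, 400001)
dx = x[1] - x[0]
phi = np.exp(-x**2 / 2)

# derivatives approximated by repeated central differences
derivs = [phi]
for _ in range(3):
    derivs.append(np.gradient(derivs[-1], dx))

for alpha in range(4):
    for beta in range(4):
        p = np.max(np.abs(x**alpha * derivs[beta]))
        print(f"p_({alpha},{beta})(phi) ~ {p:.4g}")   # every value is finite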

The derivative of a tempered distribution is again a tempered distribution. Tempered distributions generalize the bounded (or slow-growing) locally integrable functions; all distributions with compact support and all square-integrable functions can be viewed as tempered distributions.

To study the Fourier transform, it is best to consider complex-valued test functions and complex-linear distributions. The ordinary continuous Fourier transform F then yields an automorphism of the Schwartz space, and we can define the Fourier transform of the tempered distribution S by (FS)(φ) = S(Fφ) for every test function φ. FS is thus again a tempered distribution. The Fourier transform is a continuous, linear, bijective operator from the space of tempered distributions to itself. This operation is compatible with differentiation in the sense that

F (d/dx S) = ix FS

and also with convolution: if S is a tempered distribution and ψ is a slowly increasing infinitely often differentiable function on Rn (meaning that all derivatives of ψ grow at most as fast as polynomials), then Sψ is again a tempered distribution and

F(Sψ) = FS * Fψ.
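The differentiation rule F(dS/dx) = ix FS above can be sanity-checked numerically (an illustrative sketch using NumPy's discrete FFT as a stand-in for the continuous Fourier transform, and a Gaussian as the tempered distribution; the discretization details are assumptions of the sketch, not part of the original text):

import numpy as np

n = 4096
x = np.linspace(-20.0, 20.0, n, endpoint=False)
dx = x[1] - x[0]
omega = 2 * np.pi * np.fft.fftfreq(n, d=dx)   # discrete stand-in for the frequency variable

f = np.exp(-x**2)             # a Schwartz function standing in for S
df = -2 * x * np.exp(-x**2)   # its derivative, computed analytically

lhs = np.fft.fft(df)              # F(dS/dx)
rhs = 1j * omega * np.fft.fft(f)  # i * (frequency) * F(S)
print(np.max(np.abs(lhs - rhs)))  # negligible compared to np.max(np.abs(lhs))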

Using holomorphic functions as test functions

The success of the theory led to investigation of the idea of hyperfunction, in which spaces of holomorphic functions are used as test functions. A refined theory has been developed, in particular by Mikio Sato, using sheaf theory and several complex variables. This extends the range of symbolic methods that can be made into rigorous mathematics, for example Feynman integrals.

See also Colombeau algebra.

