Orthogonal polynomials

In mathematics, an orthogonal polynomial sequence is an infinite sequence of polynomials p0(x), p1(x), p2(x), ... , in which each pn(x) has degree n, and such that any two different polynomials in the sequence are orthogonal to each other in the following sense:

One can define an inner product on functions, (analogous to the ordinary "dot product" for vectors), by integrating the product of the functions:
\langle f,g \rangle=\int_{x_1}^{x_2} f(x)g(x)\,dx
More generally, one can put a fixed "weight function" W(x) into the integral:
\langle f,g \rangle=\int_{x_1}^{x_2} f(x)g(x)W(x)\,dx
Two functions are orthogonal to each other if their inner product is zero, in the same way that ordinary vectors are orthogonal (perpendicular) if their dot product is zero.
Such an inner product makes the set of all functions of finite norm a Hilbert space.

So a polynomial sequence is an orthogonal sequence with respect to the weight function W when any two different polynomials in the sequence are orthogonal, using that weight function, i.e.,

\langle p_m, p_n \rangle=\int_{x_1}^{x_2} p_m(x) p_n(x)\,W(x)\,dx=0\qquad \mathrm{whenever}\qquad m\neq n.

The interval of integration is called the interval of orthogonality. It might be infinite at one or both ends.
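
As an illustration, the weighted inner product can be evaluated numerically. The sketch below (a minimal example, assuming SciPy is available; the helper name inner_product is illustrative) uses the simplest choice W = 1 on [−1, 1]; any other weight and interval could be substituted.

    # Minimal numerical sketch of the weighted inner product <f, g>.
    # The weight W = 1 and interval [-1, 1] are just the simplest choice.
    from scipy.integrate import quad

    def inner_product(f, g, w=lambda x: 1.0, a=-1.0, b=1.0):
        """Approximate <f, g> = integral from a to b of f(x) g(x) W(x) dx."""
        value, _ = quad(lambda x: f(x) * g(x) * w(x), a, b)
        return value

    # x and x^2 are orthogonal on [-1, 1] with weight 1 (odd times even):
    print(inner_product(lambda x: x, lambda x: x**2))   # approximately 0.0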

The field of orthogonal polynomials developed in the late 19th century from a study of continued fractions by Stieltjes. It evolved into a field rich in applications to many areas of mathematics and physics.

The simplest orthogonal polynomials are the Legendre polynomials, for which the interval of orthogonality is [−1, 1] and the weight function is simply 1:

P_0(x) = 1\,
P_1(x) = x\,
P_2(x) = \frac{3x^2-1}{2}\,
P_3(x) = \frac{5x^3-3x}{2}\,
P_4(x) = \frac{35x^4-30x^2+3}{8}\,
\vdots

These are all orthogonal over [−1, 1]:

\int_{-1}^{1} P_m(x)P_n(x)\,dx = 0\qquad \mathrm{whenever}\qquad m \ne n
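
This orthogonality can be spot-checked numerically; a sketch using NumPy's Legendre module and Gauss-Legendre quadrature, together with the norm values 2/(2n + 1) listed in the table below:

    # Sketch: verify the orthogonality of P_0 ... P_5 by quadrature; the
    # diagonal values 2/(2n+1) are taken from the table below.
    import numpy as np
    from numpy.polynomial import legendre as leg

    nodes, weights = leg.leggauss(20)     # exact for polynomials of degree <= 39
    P = [leg.Legendre.basis(n) for n in range(6)]

    for m in range(6):
        for n in range(6):
            integral = np.sum(weights * P[m](nodes) * P[n](nodes))
            expected = 0.0 if m != n else 2.0 / (2 * n + 1)
            assert abs(integral - expected) < 1e-12
    print("P_0 ... P_5 are pairwise orthogonal on [-1, 1]")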

We require that the weight function be strictly positive in the interior of the interval of orthogonality. In some cases, it may be zero, or go off to infinity, at the end points. The integral of the weight function times any polynomial must be finite.

Now any sequence of polynomials p_0, p_1, \dots, with each pk having degree k, is a basis for the (infinite-dimensional) vector space of all polynomials. An orthogonal polynomial sequence is simply such a sequence that forms an orthogonal basis for that space, relative to the given inner product.

The Gram-Schmidt process can turn any basis for a vector space into an orthogonal basis, by starting with one vector and then repeatedly incorporating new vectors while making each new vector orthogonal to all the previous ones. This is done by subtracting suitable linear combinations of the previous vectors. Doing this for polynomials is often used as an exercise in elementary linear algebra courses. It results in the Legendre polynomials.
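
A sketch of this construction with SymPy, orthogonalizing the monomials 1, x, x^2, ... against the unweighted inner product above (the results agree with the Legendre polynomials up to scaling):

    # Gram-Schmidt on the monomials 1, x, x^2, ... with
    # <f, g> = integral over [-1, 1] of f g dx (weight 1).
    import sympy as sp

    x = sp.symbols('x')
    inner = lambda f, g: sp.integrate(f * g, (x, -1, 1))

    basis = []
    for k in range(5):
        v = x**k
        for p in basis:
            v -= inner(v, p) / inner(p, p) * p   # remove components along earlier vectors
        basis.append(sp.expand(v))

    for k, p in enumerate(basis):
        print(k, p)   # 1, x, x**2 - 1/3, x**3 - 3*x/5, x**4 - 6*x**2/7 + 3/35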

When making an orthogonal basis, one may be tempted to make an orthonormal basis, that is, one in which \langle p_n, p_n \rangle\ =\ 1. For polynomials, this would often result in ugly square roots in the coefficients. Instead, the polynomials are scaled in an agreed-upon way that makes the coefficients and other formulas simpler. This is called standardization. The "classical" polynomials listed below have been standardized, typically by setting their leading coefficients to some specific quantity, or by setting a specific value for the polynomial. This standardization has no mathematical significance; it is just a convention. Standardization also involves scaling the weight function in an agreed-upon way.

Once a polynomial sequence has been standardized, we can define the norm. Let

h_n=\langle p_n,\ p_n \rangle.

The norm is the square root of this. The values of \ h_n for the standardized classical polynomials will be listed in the table below. Using \ h_n, we have

\langle p_m,\ p_n \rangle\ =\ \delta_{mn}h_n

where δmn is the Kronecker delta.

General properties of orthogonal polynomial sequences

All orthogonal polynomial sequences have a number of elegant and fascinating properties. Before proceeding to them, we need two lemmas:

Lemma 1: Given an orthogonal polynomial sequence \ p_i(x), any nth-degree polynomial S(x) can be expanded in terms of p_0, \dots, p_n. That is, there are coefficients {\alpha}_0, \dots, {\alpha}_n such that

S(x)=\sum_{i=0}^n {\alpha}_i\ p_i(x).

Proof by mathematical induction. Choose \ {\alpha}_n so that the \ x^n term of S(x) matches that of \ {\alpha}_n p_n(x). Then \ S(x)-{\alpha}_n p_n(x) is a polynomial of degree at most n − 1. Continue downward.

Lemma 2: Given an orthogonal polynomial sequence, each of its polynomials is orthogonal to any polynomial of strictly lower degree.

Proof: Given n, any polynomial of degree n − 1 or lower can be expanded in terms of p_0, \dots, p_{n-1}, by Lemma 1. Since pn is orthogonal to each of these, it is orthogonal to the whole polynomial.
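
Orthogonality also gives the coefficients of Lemma 1 explicitly: taking the inner product of S with p_i kills every other term, so {\alpha}_i = \langle S, p_i \rangle / h_i. A SymPy sketch for the Legendre case:

    # Expansion coefficients via orthogonality: alpha_i = <S, p_i> / h_i.
    import sympy as sp

    x = sp.symbols('x')
    inner = lambda f, g: sp.integrate(f * g, (x, -1, 1))
    P = [sp.legendre(n, x) for n in range(4)]

    S = 5*x**3 - 2*x + 1                           # an arbitrary cubic
    alphas = [inner(S, p) / inner(p, p) for p in P]
    print(alphas)                                  # [1, 1, 0, 2]
    assert sp.simplify(sum(a * p for a, p in zip(alphas, P)) - S) == 0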

Recurrence relations

Any orthogonal sequence has a recurrence formula relating any three consecutive polynomials in the sequence:

p_{n+1}\ =\ (a_nx+b_n)\ p_n\ -\ c_n\ p_{n-1}.

The coefficients a, b, and c depend on n. They also depend on the standardization.

The values of an, bn and cn can be worked out directly. Let kj and kj' be the coefficients of x^j and x^{j-1} in pj:

p_j(x)=k_jx^j+k_j'x^{j-1}+\cdots

and hj be the inner product of pj with itself:

h_j\ =\ \langle p_j,\ p_j \rangle.

We have

a_n=\frac{k_{n+1}}{k_n},\qquad b_n=a_n \left(\frac{k_{n+1}'}{k_{n+1}} - \frac{k_n'}{k_n} \right), \qquad c_n=a_n \left(\frac{k_{n-1}h_n}{k_n h_{n-1}} \right).
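
For the Legendre case these formulas give a_n = (2n+1)/(n+1), b_n = 0 and c_n = n/(n+1) (the values listed in the table below); a SymPy sketch checking the resulting recurrence:

    # Check the three-term recurrence for Legendre polynomials using the
    # table values a_n = (2n+1)/(n+1), b_n = 0, c_n = n/(n+1).
    import sympy as sp

    x = sp.symbols('x')
    for n in range(1, 8):
        a_n = sp.Rational(2*n + 1, n + 1)
        c_n = sp.Rational(n, n + 1)
        lhs = sp.legendre(n + 1, x)
        rhs = a_n * x * sp.legendre(n, x) - c_n * sp.legendre(n - 1, x)
        assert sp.simplify(lhs - rhs) == 0
    print("recurrence verified for n = 1 .. 7")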

Existence of real roots

Each polynomial pn in an orthogonal sequence has all n of its roots real, distinct, and strictly inside the interval of orthogonality.

(Anyone who has graphed polynomials in high school knows that it is very rare for a randomly-chosen high-degree polynomial to have all of its roots real.)

Interlacing of roots

The roots of each polynomial lie strictly between the roots of the next higher polynomial in the sequence.
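
Both root properties are easy to confirm numerically; a NumPy sketch for the Legendre case:

    # Sketch: Legendre roots are real, lie inside (-1, 1), and interlace
    # with the roots of the next polynomial in the sequence.
    import numpy as np
    from numpy.polynomial import legendre as leg

    for n in range(1, 10):
        r_n  = leg.Legendre.basis(n).roots()
        r_n1 = leg.Legendre.basis(n + 1).roots()
        assert np.allclose(np.imag(r_n), 0) and np.all(np.abs(np.real(r_n)) < 1)
        r_n, r_n1 = np.sort(np.real(r_n)), np.sort(np.real(r_n1))
        # exactly one root of P_n between consecutive roots of P_{n+1}
        for lo, hi in zip(r_n1[:-1], r_n1[1:]):
            assert np.sum((r_n > lo) & (r_n < hi)) == 1
    print("roots are real, inside (-1, 1), and interlacing")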

Differential equations leading to orthogonal polynomials

A very important class of orthogonal polynomials arises from a differential equation of the form

{Q(x)}\,f'' + {L(x)}\,f' + {\lambda}f = 0\,

where Q is a given quadratic (at most) polynomial, and L is a given linear polynomial. The function f, and the constant λ, are to be found.

(Note that it makes sense for such an equation to have a polynomial solution: each term in the equation is then a polynomial, and the degrees are consistent.)

This is a Sturm-Liouville type of equation. Such equations generally have singularities in their solution functions f except for particular values of λ. They can be thought of as eigenvector/eigenvalue problems: letting D be the differential operator, D(f) = Q f'' + L f'\,, and changing the sign of λ, the problem is to find the eigenvectors (eigenfunctions) f, and the corresponding eigenvalues λ, such that f does not have singularities and D(f) = λf.

The solutions of this differential equation have singularities unless λ takes on specific values. There is a sequence of numbers {\lambda}_0, {\lambda}_1, {\lambda}_2, \dots\, that leads to a sequence of polynomial solutions P_0, P_1, P_2, \dots\, if one of the following sets of conditions is met:

  1. Q is actually quadratic, L is linear, Q has two distinct real roots, the root of L lies strictly between the roots of Q, and the leading terms of Q and L have the same sign.
  2. Q is not actually quadratic, but is linear; L is linear; the roots of Q and L are different; and the leading terms of Q and L have the same sign if the root of L is less than the root of Q, and opposite signs otherwise.
  3. Q is just a nonzero constant, L is linear, and the leading term of L has the opposite sign of Q.

These three cases lead to the Jacobi-like, Laguerre-like, and Hermite-like polynomials, respectively.

In each of these three cases, we have the following:

  • The solutions are a series of polynomials P_0, P_1, P_2, \dots\,, each P_n\, having degree n, and corresponding to a number {\lambda}_n\,.
  • The interval of orthogonality is bounded by whatever roots Q has.
  • The root of L is inside the interval of orthogonality.
  • Letting R(x) = e^{\int \frac{L(x)}{Q(x)}\,dx}\,, the polynomials are orthogonal under the weight function W(x) =\frac{R(x)}{Q(x)}\,
  • W(x) has no zeros or infinities inside the interval, though it may have zeros or infinities at the end points.
  • W(x) gives a finite inner product to any pair of polynomials.
  • W(x) can be made to be greater than 0 in the interval. (Negate the entire differential equation if necessary so that Q(x) > 0 inside the interval.)

Because of the constant of integration, the quantity R(x) is determined only up to an arbitrary positive multiplicative constant. It will be used only in homogeneous differential equations (where this doesn't matter) and in the definition of the weight function (which is likewise determined only up to a constant). The tables below give the "official" values of R(x) and W(x).
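
As a small worked example of this construction, the weight can be produced symbolically from Q and L; the sketch below uses the Hermite data Q = 1 and L = −2x from the table below, and the constant of integration only rescales W.

    # Build R(x) = exp(integral of L/Q) and the weight W = R/Q symbolically,
    # for the Hermite data Q = 1, L = -2x.
    import sympy as sp

    x = sp.symbols('x')
    Q = sp.Integer(1)
    Lpoly = -2*x
    R = sp.exp(sp.integrate(Lpoly / Q, x))
    W = sp.simplify(R / Q)
    print(R)   # exp(-x**2)
    print(W)   # exp(-x**2), the Hermite weight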

Rodrigues' formula

Under the assumptions of the preceding section, Pn(x) is proportional to \frac{1}{W(x)} \  \frac{d^n}{dx^n}\left(W(x)[Q(x)]^n\right).

This is known as Rodrigues' formula. It is often written

P_n(x) = \frac{1}{{e_n}W(x)} \  \frac{d^n}{dx^n}\left(W(x)[Q(x)]^n\right)

where the numbers en depend on the standardization. The standard values of en will be given in the tables below.
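
A SymPy sketch of Rodrigues' formula for the Legendre case (W = 1, Q = 1 − x^2, e_n = (−2)^n n!, as in the table below), checked against SymPy's built-in Legendre polynomials:

    # Rodrigues' formula for the Legendre case, compared with sympy.legendre.
    import sympy as sp

    x = sp.symbols('x')

    def rodrigues_legendre(n):
        e_n = (-2)**n * sp.factorial(n)
        return sp.expand(sp.diff((1 - x**2)**n, x, n) / e_n)

    for n in range(6):
        assert sp.expand(rodrigues_legendre(n) - sp.legendre(n, x)) == 0
    print([rodrigues_legendre(n) for n in range(4)])   # [1, x, 3*x**2/2 - 1/2, 5*x**3/2 - 3*x/2]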

The numbers λn

Under the assumptions of the preceding section, we have

{\lambda}_n = - n \left( \frac{n-1}{2}\ Q'' + L' \right).

(Since Q is at most quadratic and L is linear, Q'' and L' are constants, so these are just numbers.)
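
For example, in the Legendre case below (Q = 1 − x^2, L = −2x, so Q'' = −2 and L' = −2), this gives

{\lambda}_n = -n\left(\frac{n-1}{2}\,(-2) + (-2)\right) = n(n-1) + 2n = n(n+1),

and in the Hermite case (Q = 1, L = −2x) it gives {\lambda}_n = -n(0 - 2) = 2n, matching the values quoted in those sections below.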

Second form for the differential equation

Let R(x) = e^{\int \frac{L(x)}{Q(x)}\,dx}\,.

Then

(Ry')' = R\,y'' + R'\,y' = R\,y'' + \frac{R\,L}{Q}\,y'.

Now multiply the differential equation

{Q}\,y'' + {L}\,y' + {\lambda}\,y = 0\,

by R/Q, getting

R\,y'' + \frac{R\,L}{Q}\,y' + \frac{R\,\lambda}{Q}\,y = 0\,

or

(Ry')' + \frac{R\,\lambda}{Q}\,y = 0.\,

This is the standard Sturm-Liouville form for the equation.

Third form for the differential equation

Let S(x) = \sqrt{R(x)} = e^{\int \frac{L(x)}{2\,Q(x)}\,dx}\,.

Then

S' = \frac{S\,L}{2\,Q}.

Now multiply the differential equation

{Q}\,y'' + {L}\,y' + {\lambda}\,y = 0\,

by S/Q, getting

S\,y'' + \frac{S\,L}{Q}\,y' + \frac{S\,\lambda}{Q}\,y = 0\,

or

S\,y'' + 2\,S'\,y' + \frac{S\,\lambda}{Q}\,y = 0\,

But (S\,y)'' = S\,y'' + 2\,S'\,y' + S''\,y, so

(S\,y)'' + \left(\frac{S\,\lambda}{Q} - S''\right)\,y = 0,\,

or, letting u = Sy,

u'' + \left(\frac{\lambda}{Q} - \frac{S''}{S}\right)\,u = 0.\,

Formulas involving derivatives

Under the assumptions of the preceding section, let P_n^{[r]} denote the rth derivative of Pn. (We put the "r" in brackets to avoid confusion with an exponent.) P_n^{[r]} is a polynomial of degree n − r. Then we have the following:

  • (orthogonality) For fixed r, the polynomials P_r^{[r]}, P_{r+1}^{[r]}, P_{r+2}^{[r]}, \dots are orthogonal, weighted by WQ^r\,.
  • (generalized Rodrigues' formula) P_n^{[r]} is proportional to \frac{1}{W(x)[Q(x)]^r} \  \frac{d^{n-r}}{dx^{n-r}}\left(W(x)[Q(x)]^n\right).
  • (differential equation) P_n^{[r]} is a solution of {Q}\,y'' + (rQ'+L)\,y' + [{\lambda}_n-{\lambda}_r]\,y = 0\,, where {\lambda}_r\, is the same function as {\lambda}_n\,, that is, {\lambda}_r = - r \left( \frac{r-1}{2}\ Q'' + L' \right)
  • (differential equation, second form) P_n^{[r]} is a solution of (RQ^{r}y')' + [{\lambda}_n-{\lambda}_r]RQ^{r-1}\,y = 0\,

There are also some mixed recurrences. In each of these, the numbers a, b, and c depend on n and r, and differ from one formula to the next.

  • P_n^{[r]} = aP_{n+1}^{[r+1]} + bP_n^{[r+1]} + cP_{n-1}^{[r+1]}
  • P_n^{[r]} = (ax+b)P_n^{[r+1]} + cP_{n-1}^{[r+1]}
  • QP_n^{[r+1]} = (ax+b)P_n^{[r]} + cP_{n-1}^{[r]}

There are an enormous number of other formulas involving orthogonal polynomials in various ways. Here is a tiny sample of them, relating to the Chebyshev, associated Laguerre, and Hermite polynomials:

  • 2\,T_{m}(x)\,T_{n}(x) = T_{m+n}(x) + T_{m-n}(x)\,
  • H_{2n}(x) = (-4)^{n}\,n!\,L_{n}^{(-1/2)}(x^2)
  • H_{2n+1}(x) = 2(-4)^{n}\,n!\,x\,L_{n}^{(1/2)}(x^2)
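
These identities can be spot-checked numerically; a sketch using SciPy's physicists' Hermite and generalized Laguerre evaluators:

    # Spot-check the Hermite/Laguerre identities above at a few points.
    import numpy as np
    from scipy.special import eval_hermite, eval_genlaguerre, factorial

    x = np.linspace(-2.0, 2.0, 7)
    for n in range(5):
        lhs_even = eval_hermite(2*n, x)
        rhs_even = (-4.0)**n * factorial(n) * eval_genlaguerre(n, -0.5, x**2)
        lhs_odd  = eval_hermite(2*n + 1, x)
        rhs_odd  = 2 * (-4.0)**n * factorial(n) * x * eval_genlaguerre(n, 0.5, x**2)
        assert np.allclose(lhs_even, rhs_even) and np.allclose(lhs_odd, rhs_odd)
    print("Hermite-Laguerre identities verified for n = 0 .. 4")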

The classical orthogonal polynomials

The class of polynomials arising from the differential equation described above has many important applications in such areas as mathematical physics, interpolation theory, the theory of random matrices, computer approximations, and many others. All of these polynomial sequences are equivalent, under scaling and/or shifting of the domain, and standardizing of the polynomials, to more restricted classes. Those restricted classes are the "classical orthogonal polynomials".

  • Every Jacobi-like polynomial sequence can have its domain shifted and/or scaled so that its interval of orthogonality is [−1, 1], and has Q = 1 − x2. They can then be standardized into the Jacobi polynomials P_n^{(\alpha, \beta)}. There are several important subclasses of these: Gegenbauer, Legendre, and two types of Chebyshev.
  • Every Laguerre-like polynomial sequence can have its domain shifted, scaled, and/or reflected so that its interval of orthogonality is [0, \infty), and has Q = x. They can then be standardized into the Associated Laguerre polynomials L_n^{(\alpha)}. The plain Laguerre polynomials \ L_n are a subclass of these.
  • Every Hermite-like polynomial sequence can have its domain shifted and/or scaled so that its interval of orthogonality is (-\infty, \infty), and has Q = 1. They can then be standardized into the Hermite polynomials H_n\,.

Because all polynomial sequences arising from a differential equation in the manner described above are trivially equivalent to the classical polynomials, the actual classical polynomials are always used.

Jacobi polynomials

The Jacobi-like polynomials, once they have had their domain shifted and scaled so that the interval of orthogonality is [−1, 1], still have two parameters to be determined. They are α and β in the Jacobi polynomials, written P_n^{(\alpha, \beta)}. We have Q(x) = 1-x^2\, and L(x) = \beta-\alpha-(\alpha+\beta+2)\, x. Both α and β are required to be greater than −1. (This puts the root of L inside the interval of orthogonality.)

When α and β are not equal, these polynomials are not symmetrical about x = 0.

The differential equation

(1-x^2)\,y'' + (\beta-\alpha-[\alpha+\beta+2]\,x)\,y' + {\lambda}\,y = 0\qquad \mathrm{with}\qquad\lambda = n(n+1+\alpha+\beta)\,

is Jacobi's equation.

For further details, see Jacobi polynomials.

Gegenbauer polynomials

When one sets the parameters α and β in the Jacobi polynomials equal to each other, one obtains the Gegenbauer or ultraspherical polynomials. They are written C_n^{(\alpha)}, and defined as

C_n^{(\alpha)}(x) = \frac{\Gamma(2\alpha\!+\!n)\,\Gamma(\alpha\!+\!1/2)} {\Gamma(2\alpha)\,\Gamma(\alpha\!+\!n\!+\!1/2)}\! \  P_n^{(\alpha-1/2, \alpha-1/2)}.

We have Q(x) = 1-x^2\, and L(x) = -(2\alpha+1)\, x. \alpha\, is required to be greater than −1/2.

(Incidentally, the standardization given in the table below would make no sense for α = 0 and n ≠ 0, because it would set the polynomials to zero. In that case, the accepted standardization sets C_n^{(0)}(1) = \frac{2}{n} instead of the value given in the table.)

Ignoring the above considerations, the parameter α is closely related to the derivatives of C_n^{(\alpha)}:

C_n^{(\alpha+1)}(x) = \frac{1}{2\alpha}\! \  \frac{d}{dx}C_{n+1}^{(\alpha)}(x)

or, more generally:

C_n^{(\alpha+m)}(x) = \frac{\Gamma(\alpha)}{2^m\Gamma(\alpha+m)}\! \  C_{n+m}^{(\alpha)[m]}(x).

All the other classical Jacobi-like polynomials (Legendre, etc.) are special cases of the Gegenbauer polynomials, obtained by choosing a value of α and choosing a standardization.

For further details, see Gegenbauer polynomials.

Legendre polynomials

The differential equation is

(1-x^2)\,y'' - 2x\,y' + {\lambda}\,y = 0\qquad \mathrm{with}\qquad\lambda = n(n+1).\,

This is Legendre's equation.

The second form of the differential equation is

([1-x^2]\,y')' + \lambda\,y = 0.\,

The recurrence relation is

(n+1)\,P_{n+1}(x) = (2n+1)x\,P_n(x) - n\,P_{n-1}(x).\,

A mixed recurrence is

P_{n+1}^{[r+1]}(x) = P_{n-1}^{[r+1]}(x) + (2n+1)\,P_n^{[r]}(x).\,

Rodrigues' formula is

P_n(x) = (-1)^n\,\frac{1}{2^n\,n!} \  \frac{d^n}{dx^n}\left([1-x^2]^n\right).

For further details, see Legendre polynomials.

Associated Legendre polynomials

The Associated Legendre polynomials, denoted P_\ell^{(m)}(x) where \ell and m are integers with 0{\le}m{\le}\ell, are defined as

P_\ell^{(m)}(x) = (-1)^m\,(1-x^2)^{m/2}\ P_\ell^{[m]}(x).\,

The m in parentheses (to avoid confusion with an exponent) is a parameter. The m in brackets denotes the mth derivative of the Legendre polynomial.

These "polynomials" are misnamed -- they are not polynomials when m is odd.

They have a recurrence relation:

(\ell+1-m)\,P_{\ell+1}^{(m)}(x) = (2\ell+1)x\,P_\ell^{(m)}(x) - (\ell+m)\,P_{\ell-1}^{(m)}(x)\,
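
This recurrence can be checked directly from the definition above; a SymPy sketch (the common factor (1 − x^2)^{m/2} is divided out so that only a polynomial identity remains):

    # Check the associated Legendre recurrence from the definition
    # P_l^(m)(x) = (-1)^m (1-x^2)^(m/2) d^m/dx^m P_l(x).
    import sympy as sp

    x = sp.symbols('x')

    def assoc_P(l, m):
        return (-1)**m * (1 - x**2)**sp.Rational(m, 2) * sp.diff(sp.legendre(l, x), x, m)

    for m in range(3):
        for l in range(m + 1, 6):
            lhs = (l + 1 - m) * assoc_P(l + 1, m)
            rhs = (2*l + 1) * x * assoc_P(l, m) - (l + m) * assoc_P(l - 1, m)
            diff = sp.expand((lhs - rhs) / (1 - x**2)**sp.Rational(m, 2))
            assert sp.simplify(diff) == 0
    print("associated Legendre recurrence verified")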

For fixed m, the functions P_m^{(m)}, P_{m+1}^{(m)}, P_{m+2}^{(m)}, \dots are orthogonal over [−1, 1], with weight 1.

For given m, P_\ell^{(m)}(x) are the solutions of

(1-x^2)\,y'' -2xy' + [\lambda - \frac{m^2}{1-x^2}]\,y = 0\qquad \mathrm{with}\qquad\lambda = \ell(\ell+1)\,

Chebyshev polynomials

The differential equation is

(1-x^2)\,y'' - x\,y' + {\lambda}\,y = 0\qquad \mathrm{with}\qquad\lambda = n^2.\,

This is Chebyshev's equation.

The recurrence relation is

T_{n+1}(x) = 2x\,T_n(x) - T_{n-1}(x).\,

Rodrigues' formula is

T_n(x) = \frac{\Gamma(1/2)\sqrt{1-x^2}}{(-2)^n\,\Gamma(n+1/2)} \  \frac{d^n}{dx^n}\left([1-x^2]^{n-1/2}\right).

These polynomials have the property that, in the interval of orthogonality,

T_n(x) = \cos(n\,\cos^{-1}(x)).

(To prove it, use the recurrence formula.)

This means that all their local minima and maxima have values of −1 and +1, that is, the polynomials are "level". Because of this, expansion of functions in terms of Chebyshev polynomials is sometimes used for polynomial approximations in computer math libraries.
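
A quick numerical sketch of this identity, building T_n from the recurrence and comparing with cos(n arccos x):

    # Evaluate T_n via T_{n+1} = 2x T_n - T_{n-1} and compare with
    # cos(n arccos x) on the interval of orthogonality.
    import numpy as np

    def cheb_T(n, x):
        t_prev, t = np.ones_like(x), np.array(x, dtype=float)
        if n == 0:
            return t_prev
        for _ in range(n - 1):
            t_prev, t = t, 2 * x * t - t_prev
        return t

    x = np.linspace(-1.0, 1.0, 201)
    for n in range(8):
        assert np.allclose(cheb_T(n, x), np.cos(n * np.arccos(x)))
    print("T_n(x) = cos(n arccos(x)) verified for n = 0 .. 7")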

Some authors use versions of these polynomials that have been shifted so that the interval of orthogonality is [0, 1] or [−2, 2].

There are also Chebyshev polynomials of the second kind, denoted U_n\,.

We have:

U_n = \frac{1}{n+1}\,T_{n+1}'.\,

For further details, including the expressions for the first few polynomials, see Chebyshev polynomials.

Laguerre polynomials

The most general Laguerre-like polynomials, after the domain has been shifted and scaled, are the Associated Laguerre polynomials (also called Generalized Laguerre polynomials), denoted L_n^{(\alpha)}. There is a parameter α, which can be any real number strictly greater than −1. The parameter is put in parentheses to avoid confusion with an exponent. The plain Laguerre polynomials are simply the α = 0 version of these:

L_n(x) = L_n^{(0)}(x).\,

The differential equation is

x\,y'' + (\alpha + 1-x)\,y' + {\lambda}\,y = 0\qquad \mathrm{with}\qquad\lambda = n.\,

This is Laguerre's equation.

The second form of the differential equation is

(x^{\alpha+1}\,e^{-x}\, y')' + {\lambda}\,x^{\alpha}\,e^{-x}\,y = 0.\,

The recurrence relation is

(n+1)\,L_{n+1}^{(\alpha)}(x) = (2n+1+\alpha-x)\,L_n^{(\alpha)}(x) - (n+\alpha)\,L_{n-1}^{(\alpha)}(x).\,

Rodrigues' formula is

L_n^{(\alpha)}(x) = \frac{x^{-\alpha}e^x}{n!} \  \frac{d^n}{dx^n}\left(x^{n+\alpha}\,e^{-x}\right).

The parameter α is closely related to the derivatives of L_n^{(\alpha)}:

L_n^{(\alpha+1)}(x) = - \frac{d}{dx}L_{n+1}^{(\alpha)}(x)

or, more generally:

L_n^{(\alpha+m)}(x) = (-1)^m L_{n+m}^{(\alpha)[m]}(x).
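
A numerical sketch of the first of these relations, using SciPy's generalized Laguerre polynomials:

    # Check L_n^(alpha+1)(x) = -d/dx L_{n+1}^(alpha)(x) at a few points.
    import numpy as np
    from scipy.special import genlaguerre

    x = np.linspace(0.0, 10.0, 11)
    for alpha in (0.0, 0.5, 2.0):
        for n in range(5):
            lhs = genlaguerre(n, alpha + 1)(x)
            rhs = -genlaguerre(n + 1, alpha).deriv()(x)
            assert np.allclose(lhs, rhs)
    print("derivative relation verified")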

Laguerre's equation can be manipulated into a form that is more useful in applications:

u = x^{\frac{\alpha-1}{2}}e^{-x/2}L_n^{(\alpha)}(x)

is a solution of

u'' + \frac{2}{x}\,u' + \left[\frac{\lambda}{x} - \frac{1}{4} - \frac{\alpha^2-1}{4x^2}\right]\,u = 0\qquad \mathrm{with}\qquad\lambda = n+\frac{\alpha+1}{2}.\,

This can be further manipulated. When \ell = \frac{\alpha-1}{2} is an integer, and n{\ge}\ell+1:

u = x^{\ell}e^{-x/2}L_{n-\ell-1}^{(2\ell+1)}(x)

is a solution of

u'' + \frac{2}{x}\,u' + \left[\frac{\lambda}{x} - \frac{1}{4} - \frac{\ell(\ell+1)}{x^2}\right]\,u = 0\qquad \mathrm{with}\qquad\lambda = n.\,

The solution is often expressed in terms of derivatives instead of associated Laguerre polynomials:

u = x^{\ell}e^{-x/2}L_{n+\ell}^{[2\ell+1]}(x).

This equation arises in quantum mechanics, in the radial part of the solution of the Schrödinger equation for a one-electron atom.

Physicists often use a definition for the Laguerre polynomials that is larger, by a factor of (n!), than the definition used here.

For further details, including the expressions for the first few polynomials, see Laguerre polynomials.

Hermite polynomials

The differential equation is

y'' - 2xy' + {\lambda}\,y = 0,\qquad \mathrm{with}\qquad\lambda = 2n.\,

This is Hermite's equation.

The second form of the differential equation is

(e^{-x^2}\,y')' + e^{-x^2}\,\lambda\,y = 0.\,

The third form is

(e^{-x^2/2}\,y)'' + ({\lambda}+1-x^2)(e^{-x^2/2}\,y) = 0.\,

The recurrence relation is

H_{n+1}(x) = 2x\,H_n(x) - 2n\,H_{n-1}(x).\,

Rodrigues' formula is

H_n(x) = (-1)^n\,e^{x^2} \  \frac{d^n}{dx^n}\left(e^{-x^2}\right).

The first few Hermite polynomials are

H_0(x) = 1\,
H_1(x) = 2x\,
H_2(x) = 4x^2-2\,
H_3(x) = 8x^3-12x\,
H_4(x) = 16x^4-48x^2+12\,
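
The list above can be regenerated from H_0 = 1, H_1 = 2x and the recurrence; a SymPy sketch:

    # Generate H_0 ... H_4 from the recurrence H_{n+1} = 2x H_n - 2n H_{n-1}.
    import sympy as sp

    x = sp.symbols('x')
    H = [sp.Integer(1), 2*x]
    for n in range(1, 4):
        H.append(sp.expand(2*x*H[n] - 2*n*H[n-1]))
    for n, h in enumerate(H):
        print(f"H_{n}(x) =", h)   # 1, 2*x, 4*x**2 - 2, 8*x**3 - 12*x, 16*x**4 - 48*x**2 + 12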

One can define the associated Hermite functions

{\psi}_n(x) = (h_n)^{-1/2}\,e^{-x^2/2}H_n(x).\,

Because the multiplier is proportional to the square root of the weight function, these functions are orthonormal over (-\infty, \infty) with respect to the plain (unweighted) inner product.

The third form of the differential equation above, for the associated Hermite functions, is

\psi'' + ({\lambda}+1-x^2)\psi = 0.\,

The associated Hermite functions arise in many areas of mathematics and physics. In quantum mechanics, they are the solutions of Schrödinger's equation for the harmonic oscillator. They are also eigenfunctions (with eigenvalue (−i)n) of the continuous Fourier transform.
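
Their orthonormality can be checked numerically, using h_n = 2^n n! √π from the table below; a SciPy sketch:

    # Check that psi_0 ... psi_3 are orthonormal over (-inf, inf).
    import numpy as np
    from scipy.integrate import quad
    from scipy.special import eval_hermite, factorial

    def psi(n, x):
        h_n = 2.0**n * factorial(n) * np.sqrt(np.pi)
        return eval_hermite(n, x) * np.exp(-x**2 / 2) / np.sqrt(h_n)

    for m in range(4):
        for n in range(4):
            value, _ = quad(lambda t: psi(m, t) * psi(n, t), -np.inf, np.inf)
            assert abs(value - (1.0 if m == n else 0.0)) < 1e-7
    print("psi_0 .. psi_3 are orthonormal")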

Some authors, particularly probabilists, use an alternate definition of the Hermite polynomials, with a weight function of e^{-x^2/2} instead of e^{-x^2}. These polynomials are generally denoted by the two-letter symbol He\,, and can be defined as

He_n(x) = 2^{-n/2}\,H_n\left(\frac{x}{\sqrt{2}}\right).

For further details, see Hermite polynomials.

Table of classical orthogonal polynomials

Name, and conventional symbol | Chebyshev, T_n | Chebyshev (second kind), U_n | Legendre, P_n | Hermite, H_n
Limits of orthogonality | -1, 1 | -1, 1 | -1, 1 | -\infty, \infty
Weight, W(x) | (1-x^2)^{-1/2} | (1-x^2)^{1/2} | 1 | e^{-x^2}
Standardization | T_n(1)=1 | U_n(1)=n+1 | P_n(1)=1 | Lead term = 2^n
Square of norm, h_n | \pi for n=0; \pi/2 for n\ne 0 | \pi/2 | \frac{2}{2n+1} | 2^n\,n!\,\sqrt{\pi}
Leading term, k_n | 2^{n-1} | 2^n | \frac{(2n)!}{2^n\,(n!)^2} | 2^n
Second term, k'_n | 0 | 0 | 0 | 0
Q | 1-x^2 | 1-x^2 | 1-x^2 | 1
L | -x | -3x | -2x | -2x
R(x) = e^{\int \frac{L(x)}{Q(x)}\,dx} | (1-x^2)^{1/2} | (1-x^2)^{3/2} | 1-x^2 | e^{-x^2}
Constant in diff. equation, {\lambda}_n | n^2 | n(n+2) | n(n+1) | 2n
Constant in Rodrigues' formula, e_n | (-2)^n\,\frac{\Gamma(n+1/2)}{\sqrt{\pi}} | 2(-2)^n\,\frac{\Gamma(n+3/2)}{(n+1)\,\sqrt{\pi}} | (-2)^n\,n! | (-1)^n
Recurrence relation, a_n | 2 | 2 | \frac{2n+1}{n+1} | 2
Recurrence relation, b_n | 0 | 0 | 0 | 0
Recurrence relation, c_n | 1 | 1 | \frac{n}{n+1} | 2n

Name, and conventional symbol | Associated Laguerre, L_n^{(\alpha)} | Laguerre, L_n
Limits of orthogonality | 0, \infty | 0, \infty
Weight, W(x) | x^{\alpha}e^{-x} | e^{-x}
Standardization | Lead term = \frac{(-1)^n}{n!} | Lead term = \frac{(-1)^n}{n!}
Square of norm, h_n | \frac{\Gamma(n+\alpha+1)}{n!} | 1
Leading term, k_n | \frac{(-1)^n}{n!} | \frac{(-1)^n}{n!}
Second term, k'_n | \frac{(-1)^{n+1}(n+\alpha)}{(n-1)!} | \frac{(-1)^{n+1}n}{(n-1)!}
Q | x | x
L | \alpha+1-x | 1-x
R(x) = e^{\int \frac{L(x)}{Q(x)}\,dx} | x^{\alpha+1}\,e^{-x} | x\,e^{-x}
Constant in diff. equation, {\lambda}_n | n | n
Constant in Rodrigues' formula, e_n | n! | n!
Recurrence relation, a_n | \frac{-1}{n+1} | \frac{-1}{n+1}
Recurrence relation, b_n | \frac{2n+1+\alpha}{n+1} | \frac{2n+1}{n+1}
Recurrence relation, c_n | \frac{n+\alpha}{n+1} | \frac{n}{n+1}

Name, and conventional symbol | Gegenbauer, C_n^{(\alpha)} | Jacobi, P_n^{(\alpha, \beta)}
Limits of orthogonality | -1, 1 | -1, 1
Weight, W(x) | (1-x^2)^{\alpha-1/2} | (1-x)^\alpha(1+x)^\beta
Standardization | C_n^{(\alpha)}(1)=\frac{\Gamma(n+2\alpha)}{n!\,\Gamma(2\alpha)} if \alpha\ne0 | P_n^{(\alpha, \beta)}(1)=\frac{\Gamma(n+1+\alpha)}{n!\,\Gamma(1+\alpha)}
Square of norm, h_n | \frac{\pi\,2^{1-2\alpha}\Gamma(n+2\alpha)}{n!(n+\alpha)(\Gamma(\alpha))^2} | \frac{2^{\alpha+\beta+1}\,\Gamma(n+\alpha+1)\,\Gamma(n+\beta+1)}{n!\,(2n+\alpha+\beta+1)\,\Gamma(n+\alpha+\beta+1)}
Leading term, k_n | \frac{\Gamma(2n+2\alpha)\Gamma(1/2+\alpha)}{n!\,2^n\,\Gamma(2\alpha)\Gamma(n+1/2+\alpha)} | \frac{\Gamma(2n+1+\alpha+\beta)}{n!\,2^n\,\Gamma(n+1+\alpha+\beta)}
Second term, k'_n | 0 | \frac{(\alpha-\beta)\,\Gamma(2n+\alpha+\beta)}{(n-1)!\,2^n\,\Gamma(n+1+\alpha+\beta)}
Q | 1-x^2 | 1-x^2
L | -(2\alpha+1)\,x | \beta-\alpha-(\alpha+\beta+2)\,x
R(x) = e^{\int \frac{L(x)}{Q(x)}\,dx} | (1-x^2)^{\alpha+1/2} | (1-x)^{\alpha+1}(1+x)^{\beta+1}
Constant in diff. equation, {\lambda}_n | n(n+2\alpha) | n(n+1+\alpha+\beta)
Constant in Rodrigues' formula, e_n | \frac{(-2)^n\,n!\,\Gamma(2\alpha)\,\Gamma(n+1/2+\alpha)}{\Gamma(n+2\alpha)\,\Gamma(\alpha+1/2)} | (-2)^n\,n!
Recurrence relation, a_n | \frac{2(n+\alpha)}{n+1} | \frac{(2n+1+\alpha+\beta)(2n+2+\alpha+\beta)}{2(n+1)(n+1+\alpha+\beta)}
Recurrence relation, b_n | 0 | \frac{({\alpha}^2-{\beta}^2)(2n+1+\alpha+\beta)}{2(n+1)(2n+\alpha+\beta)(n+1+\alpha+\beta)}
Recurrence relation, c_n | \frac{n+2{\alpha}-1}{n+1} | \frac{(n+\alpha)(n+\beta)(2n+2+\alpha+\beta)}{(n+1)(n+1+\alpha+\beta)(2n+\alpha+\beta)}
