Lagrange multipliers on Banach spaces

In the field of calculus of variations in mathematics, the method of Lagrange multipliers on Banach spaces can be used to solve certain infinite-dimensional constrained optimization problems. The method is a generalization of the classical method of Lagrange multipliers as used to find extrema of a function of finitely many variables.

The Lagrange multiplier theorem for Banach spaces

Let X and Y be real Banach spaces. Let U be an open subset of X and let f : U → R be a continuously differentiable function. Let g : U → Y be another continuously differentiable function, the constraint: the objective is to find the extremal points (maxima or minima) of f subject to the constraint that g is zero.

Suppose that u0 is a constrained extremum of f, i.e. an extremum of f on

g^{-1} (0) = \{ x \in U | g(x) = 0 \in Y \} \subseteq U.

Suppose also that the Fréchet derivative Dg(u0) : X → Y of g at u0 is a surjective linear map. Then there exists a Lagrange multiplier λ : Y → R in Y*, the dual space to Y, such that

\mathrm{D} f (u_{0}) = \lambda \circ \mathrm{D} g (u_{0}). \quad \mbox{(L)}

Since Df(u0) is an element of the dual space X*, equation (L) can also be written as

\mathrm{D} f (u_{0}) = \left( \mathrm{D} g (u_{0}) \right)^{*} (\lambda),

where (Dg(u0))*(λ) is the pullback of λ by Dg(u0), i.e. the action of the adjoint map (Dg(u0))* on λ, as defined by

\left( \mathrm{D} g (u_{0}) \right)^{*} (\lambda) = \lambda \circ \mathrm{D} g (u_{0}).
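
In particular, if Y = R, so that there is a single scalar constraint, then λ is simply a real number and condition (L) says that the two Fréchet derivatives are proportional as linear functionals on X:

\mathrm{D} f (u_{0})(v) = \lambda \, \mathrm{D} g (u_{0})(v) \quad \mbox{for all } v \in X.

This special case is the one used in the application below.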

Connection to the finite-dimensional case

In the case that X and Y are both finite-dimensional (i.e. linearly isomorphic to R^m and R^n respectively, for some natural numbers m and n), writing out equation (L) in matrix form shows that λ is the usual Lagrange multiplier vector; in the case m = n = 1, λ is the usual Lagrange multiplier, a real number.
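
Concretely, identifying Df(u0) with the gradient ∇f(u0), Dg(u0) with the Jacobian matrix of the component functions g1, …, gn of g, and λ with a vector (λ1, …, λn), equation (L) becomes the familiar first-order condition

\nabla f (u_{0}) = \sum_{i=1}^{n} \lambda_{i} \, \nabla g_{i} (u_{0}).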

Application

In many optimization problems, one seeks to minimize a functional defined on an infinite-dimensional space such as a Banach space.

Consider, for example, the Sobolev space X = H^1_0([−1, +1]; R) and the functional f : X → R given by

f(u) = \int_{-1}^{+1} u'(x)^{2} \, \mathrm{d} x.

Without any constraint, the minimum value of f would be 0, attained by u0(x) = 0 for all x between −1 and +1. One could also consider the constrained optimization problem, to minimize f among all those u ∈ X such that the mean value of u is +1. In terms of the above theorem, the constraint g would be given by

g(u) = \frac{1}{2} \int_{-1}^{+1} u(x) \, \mathrm{d} x - 1.

Since X is infinite-dimensional, the classical method of Lagrange multipliers does not apply, and the Banach space version above is needed to solve this problem.
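
A sketch of how the theorem applies here (the integration by parts is formal; rigorously one works with the weak formulation in X = H^1_0): the Fréchet derivatives of f and g at u0 act on a test function v ∈ X by

\mathrm{D} f (u_{0})(v) = 2 \int_{-1}^{+1} u_{0}'(x) v'(x) \, \mathrm{d} x, \qquad \mathrm{D} g (u_{0})(v) = \frac{1}{2} \int_{-1}^{+1} v(x) \, \mathrm{d} x.

Since Dg(u0) is a non-zero linear functional onto Y = R, it is surjective, and the theorem gives a real number λ with

2 \int_{-1}^{+1} u_{0}'(x) v'(x) \, \mathrm{d} x = \frac{\lambda}{2} \int_{-1}^{+1} v(x) \, \mathrm{d} x \quad \mbox{for all } v \in X.

This is the weak form of the boundary value problem −2u0'' = λ/2 on (−1, +1) with u0(−1) = u0(+1) = 0, whose solution is u0(x) = (λ/8)(1 − x^2). The constraint g(u0) = 0 then forces λ = 12, so the constrained minimizer is

u_{0}(x) = \frac{3}{2} \left( 1 - x^{2} \right), \qquad f(u_{0}) = \int_{-1}^{+1} 9 x^{2} \, \mathrm{d} x = 6.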

References

  • Zeidler, Eberhard (1995). Applied functional analysis: main principles and their applications, Applied Mathematical Sciences 109. New York, NY: Springer-Verlag. ISBN 0-387-94422-2. 

This article incorporates material from Lagrange multipliers on Banach spaces on PlanetMath, which is licensed under the GFDL.