Local martingale

In mathematics, a local martingale is a type of stochastic process satisfying the localized version of the martingale property. Every martingale is a local martingale. Every local martingale that is bounded from below is a supermartingale, and every local martingale that is bounded from above is a submartingale; in particular, every bounded local martingale is a martingale. However, in general a local martingale is not a martingale, because its expectation can be distorted by large values of small probability. For example, a driftless diffusion process is a local martingale, but not necessarily a martingale.

Local martingales are essential in stochastic analysis; see Itō calculus, semimartingales, and the Girsanov theorem.

Definition

Let (Ω, F, P) be a probability space; let F∗ = { Ft | t ≥ 0 } be a filtration of F; let X : [0, +∞) × Ω → S be an F∗-adapted stochastic process with values in the set S. Then X is called an F∗-local martingale if there exists a sequence of F∗-stopping times τk : Ω → [0, +∞), almost surely increasing in k and diverging almost surely (τk → ∞ as k → ∞), such that

X_t^{\tau_{k}} := X_{\min \{ t, \tau_k \}}
is an F∗-martingale for every k.
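In other words, X coincides with a martingale up to each of the stopping times τk. The following minimal numpy sketch (not part of the article; the discretization, the level 1 and the seed are arbitrary illustrative choices) shows how a stopped path X_{min{t, τ}} is obtained from a simulated path:

import numpy as np

rng = np.random.default_rng(0)
dt, n = 1e-3, 5000
increments = rng.normal(0.0, np.sqrt(dt), n - 1)
X = np.concatenate([[0.0], np.cumsum(increments)])   # discretized sample path X_t

hits = np.flatnonzero(X >= 1.0)
k_tau = hits[0] if hits.size else n - 1               # index of the stopping time tau
X_stopped = X.copy()
X_stopped[k_tau:] = X[k_tau]                          # X^tau_t = X_{min(t, tau)}: frozen after tau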

Examples

Example 1

Let Wt be the Wiener process and T = min{ t : Wt = −1 } the time of first hit of −1. The stopped process Wmin{ t, T } is a martingale; its expectation is 0 at all times; nevertheless, its limit (as t → ∞) is equal to −1 almost surely (a kind of gambler's ruin). A time change leads to a process

\displaystyle X_t = \begin{cases}
  W_{\min(\tfrac{t}{1-t},T)} &\text{for } 0 \le t < 1,\\
  -1 &\text{for } 1 \le t < \infty.
 \end{cases}

The process  X_t is continuous almost surely; nevertheless, its expectation is discontinuous,

\displaystyle \mathbb{E} X_t = \begin{cases}
  0 &\text{for } 0 \le t < 1,\\
  -1 &\text{for } 1 \le t < \infty.
 \end{cases}

This process is not a martingale. However, it is a local martingale. A localizing sequence may be chosen as  \tau_k = \min \{ t : X_t = k \} if there is such t, otherwise τk = k. This sequence diverges almost surely, since τk = k for all k large enough (namely, for all k that exceed the maximal value of the process X). The process stopped at τk is a martingale.[details 1]
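The jump of the expectation at the instant 1 can also be seen numerically. The following rough Monte Carlo sketch (not part of the original text; the step size, sample count and seed are arbitrary) estimates  \mathbb{E} X_t  on a crude Euler grid; the estimates stay near 0 for t < 1, while X_t = −1 for t ≥ 1.

import numpy as np

rng = np.random.default_rng(1)

def sample_X(t, dt=0.01):
    """One sample of X_t = W_{min(t/(1-t), T)}, T the (discretized) first hit of -1."""
    if t >= 1.0:
        return -1.0
    s_max = t / (1.0 - t)                  # the time change s = t/(1-t)
    w = 0.0
    for _ in range(int(s_max / dt)):
        w += rng.normal(0.0, np.sqrt(dt))
        if w <= -1.0:                      # stopped at the (discrete) first hit of -1
            return w
    return w

for t in (0.5, 0.8, 0.9, 1.0):
    estimate = np.mean([sample_X(t) for _ in range(2000)])
    print(f"t = {t}:  E[X_t] ~ {estimate:+.3f}")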

Example 2

Let Wt be the Wiener process and f a measurable function such that  \mathbb{E} |f(W_1)| < \infty. Then the following process is a martingale:

\displaystyle X_t = \mathbb{E} ( f(W_1) | F_t ) = \begin{cases}
  f_{1-t}(W_t) &\text{for } 0 \le t < 1,\\
  f(W_1) &\text{for } 1 \le t < \infty;
 \end{cases}

here

\displaystyle f_s(x) = \mathbb{E} f(x+W_s) = \int f(x+y) \frac1{\sqrt{2\pi s}} \mathrm{e}^{-y^2/(2s)} \, \mathrm{d}y .
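For a concrete check (an illustration, not part of the original text), take f(x) = x²; then f_s(x) = x² + s, so X_t = W_t² + (1 − t), and  \mathbb{E} X_t = \mathbb{E} f(W_1) = 1  for every t, as the martingale property requires. A quick Monte Carlo confirmation:

import numpy as np

rng = np.random.default_rng(2)
for t in (0.25, 0.5, 0.75, 1.0):
    W_t = rng.normal(0.0, np.sqrt(t), size=200_000)   # W_t ~ N(0, t)
    X_t = W_t**2 + (1.0 - t)                          # X_t = f_{1-t}(W_t) for f(x) = x**2
    print(f"t = {t}:  E[X_t] ~ {X_t.mean():.4f}   (exact value 1)")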

The Dirac delta function  \delta (strictly speaking, not a function), being used in place of  f, leads to a process defined informally as  Y_t = \mathbb{E} ( \delta(W_1) | F_t ) and formally as

\displaystyle Y_t = \begin{cases}
  \delta_{1-t}(W_t) &\text{for } 0 \le t < 1,\\
  0 &\text{for } 1 \le t < \infty,
 \end{cases}

where

\displaystyle \delta_s(x) = \frac1{\sqrt{2\pi s}} \mathrm{e}^{-x^2/(2s)} .

The process  Y_t is continuous almost surely (since  W_1 \ne 0 almost surely), nevertheless, its expectation is discontinuous,

\displaystyle \mathbb{E} Y_t = \begin{cases}
  1/\sqrt{2\pi} &\text{for } 0 \le t < 1,\\
  0 &\text{for } 1 \le t < \infty.
 \end{cases}

This process is not a martingale. However, it is a local martingale. A localizing sequence may be chosen as  \tau_k = \min \{ t : Y_t = k \}.
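The constant value  1/\sqrt{2\pi}  of the expectation on [0, 1) can be checked by a Monte Carlo sketch (an illustration, not part of the original text; sample size and seed are arbitrary): sample  W_t \sim N(0,t)  and average  \delta_{1-t}(W_t).

import numpy as np

rng = np.random.default_rng(3)

def delta_s(x, s):
    """Gaussian density delta_s(x) = exp(-x^2/(2s)) / sqrt(2*pi*s)."""
    return np.exp(-x**2 / (2.0 * s)) / np.sqrt(2.0 * np.pi * s)

for t in (0.5, 0.9, 0.99):
    W_t = rng.normal(0.0, np.sqrt(t), size=200_000)   # W_t ~ N(0, t)
    print(f"t = {t}:  E[Y_t] ~ {delta_s(W_t, 1.0 - t).mean():.4f}"
          f"   (exact value {1.0 / np.sqrt(2.0 * np.pi):.4f})")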

Example 3

Let  Z_t be the complex-valued Wiener process, and

\displaystyle X_t = \ln | Z_t - 1 | \, .

The process  X_t is continuous almost surely (since  Z_t does not hit 1, almost surely), and is a local martingale, since the function  u \mapsto \ln|u-1| is harmonic (on the complex plane without the point 1). A localizing sequence may be chosen as  \tau_k = \min \{ t : X_t = -k \}. Nevertheless, the expectation of this process is non-constant; moreover,

\displaystyle \mathbb{E} X_t \to \infty   as  t \to \infty,

which can be deduced from the fact that the mean value of  \ln|u-1| over the circle  |u|=r tends to infinity as  r \to \infty . (In fact, it is equal to  \ln r for r ≥ 1 but to 0 for r ≤ 1).
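The growth of the expectation can be observed numerically as well; the sketch below (an illustration, not part of the original text) samples the marginal distribution of  Z_t  directly, taking  Z_t = W^1_t + \mathrm{i} W^2_t  with independent standard Brownian motions  W^1, W^2.

import numpy as np

rng = np.random.default_rng(4)
n = 200_000
for t in (1.0, 10.0, 100.0, 1000.0):
    Z_t = rng.normal(0.0, np.sqrt(t), n) + 1j * rng.normal(0.0, np.sqrt(t), n)
    print(f"t = {t}:  E[ln|Z_t - 1|] ~ {np.log(np.abs(Z_t - 1.0)).mean():.3f}")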

Martingales via local martingales

Let  M_t be a local martingale. In order to prove that it is a martingale it is sufficient to prove that  M_t^{\tau_k} \to M_t in L1 (as  k \to \infty ) for every t, that is,  \mathbb{E} | M_t^{\tau_k} - M_t | \to 0; here  M_t^{\tau_k} = M_{t\wedge \tau_k} is the stopped process. The given relation  \tau_k \to \infty implies that  M_t^{\tau_k} \to M_t almost surely. The dominated convergence theorem ensures the convergence in L1 provided that

\textstyle (*) \quad \mathbb{E} \sup_k| M_t^{\tau_k} | < \infty    for every t.

Thus, condition (*) is sufficient for a local martingale  M_t  to be a martingale. A stronger condition

\textstyle (**) \quad \mathbb{E} \sup_{s\in[0,t]} |M_s| < \infty    for every t

is also sufficient.
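For instance (a verification not in the original text), condition (**) fails for the process of Example 1: by the gambler's-ruin formula, the Brownian motion stopped at the first hit of −1 exceeds a level a > 0 with probability 1/(1 + a), hence

\displaystyle \mathbb{E} \sup_{s\in[0,1]} |X_s| \ge \mathbb{E} \sup_{u \ge 0} W_{\min\{u,T\}} = \int_0^\infty \mathbb{P} \Big[ \sup_{u \ge 0} W_{\min\{u,T\}} > a \Big] \, \mathrm{d}a = \int_0^\infty \frac{\mathrm{d}a}{1+a} = \infty,

in agreement with the fact that the process of Example 1 is not a martingale.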

Caution. The weaker condition

\textstyle \sup_{s\in[0,t]} \mathbb{E} |M_s| < \infty    for every t

is not sufficient. Moreover, the condition

\textstyle \sup_{t\in[0,\infty)} \mathbb{E} \mathrm{e}^{|M_t|} < \infty

is still not sufficient; for a counterexample see Example 3 above.

A special case:

\textstyle M_t = f(t,W_t),

where  W_t is the Wiener process, and  f : [0,\infty) \times \mathbb{R} \to \mathbb{R} is twice continuously differentiable. The process  M_t is a local martingale if and only if f satisfies the PDE

 \Big( \frac{\partial}{\partial t} + \frac12 \frac{\partial^2}{\partial x^2} \Big) f(t,x) = 0.

However, this PDE itself does not ensure that  M_t is a martingale. In order to apply (**) the following condition on f is sufficient: for every  \varepsilon>0 and t there exists  C = C(\varepsilon,t) such that

\textstyle |f(s,x)| \le C \mathrm{e}^{\varepsilon x^2}

for all  s \in [0,t] and  x \in \mathbb{R}.
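As a sanity check (an illustration, not part of the original text; the two choices of f are classical examples), the PDE can be verified symbolically, e.g. with sympy:

import sympy as sp

t, x = sp.symbols('t x')
for f in (x**2 - t, sp.exp(x - t / 2)):
    # backward heat operator (d/dt + (1/2) d^2/dx^2) applied to f
    residual = sp.diff(f, t) + sp.Rational(1, 2) * sp.diff(f, x, 2)
    print(f, "->", sp.simplify(residual))   # prints 0 for both choices

Both choices also satisfy the growth bound  |f(s,x)| \le C \mathrm{e}^{\varepsilon x^2}, so  W_t^2 - t  and  \mathrm{e}^{W_t - t/2}  are in fact martingales.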

Technical details

  1. For times before 1 it is a martingale, since a stopped Brownian motion is a martingale. After the instant 1 it is constant. It remains to check the martingale property at the instant 1. By the bounded convergence theorem, the expectation at 1 is the limit of the expectation at (n−1)/n (as n tends to infinity), and the latter does not depend on n. The same argument applies to the conditional expectation.
