Local martingale

Short description: Stochastic process admitting a sequence of stopping times under which each stopped process is a martingale

In mathematics, a local martingale is a type of stochastic process satisfying the localized version of the martingale property. Every martingale is a local martingale, and every bounded local martingale is a martingale; moreover, every local martingale that is bounded from below is a supermartingale, and every local martingale that is bounded from above is a submartingale. In general, however, a local martingale is not a martingale, because its expectation can be distorted by large values of small probability. For example, a driftless diffusion process is a local martingale, but not necessarily a martingale.

Local martingales are essential in stochastic analysis (see Itō calculus, semimartingale, and Girsanov theorem).

Definition

Let [math]\displaystyle{ (\Omega,F,P) }[/math] be a probability space; let [math]\displaystyle{ F_*=\{F_t\mid t\geq 0\} }[/math] be a filtration of [math]\displaystyle{ F }[/math]; let [math]\displaystyle{ X:[0,\infty)\times \Omega \rightarrow S }[/math] be an [math]\displaystyle{ F_* }[/math]-adapted stochastic process with values in [math]\displaystyle{ S }[/math]. Then [math]\displaystyle{ X }[/math] is called an [math]\displaystyle{ F_* }[/math]-local martingale if there exists a sequence of [math]\displaystyle{ F_* }[/math]-stopping times [math]\displaystyle{ \tau_k : \Omega \to [0,\infty) }[/math] such that

  • the [math]\displaystyle{ \tau_k }[/math] are almost surely increasing: [math]\displaystyle{ P\left\{\tau_k \lt \tau_{k+1} \right\}=1 }[/math];
  • the [math]\displaystyle{ \tau_k }[/math] diverge almost surely: [math]\displaystyle{ P \left\{\lim_{k\to\infty} \tau_k =\infty \right\}=1 }[/math];
  • the stopped process [math]\displaystyle{ X_t^{\tau_k} := X_{\min \{ t, \tau_k \}} }[/math] is an [math]\displaystyle{ F_* }[/math]-martingale for every [math]\displaystyle{ k }[/math].
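
The construction above can be illustrated numerically. Below is a minimal sketch, assuming NumPy; the Brownian path, the time grid, and the stopping rule τ_k = inf{ t : |X_t| ≥ k } are illustrative choices made here, not part of the definition.

```python
# Minimal sketch (assumes NumPy): building stopped processes X^{tau_k}
# from one simulated path.  The path and the stopping rule are
# illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)
dt, n_steps = 1e-3, 20_000
t = np.arange(n_steps + 1) * dt

# One path of a continuous adapted process (here: standard Brownian motion).
X = np.concatenate([[0.0], np.cumsum(np.sqrt(dt) * rng.standard_normal(n_steps))])

def stopped_path(X, t, tau):
    """Return the path of the stopped process  X^tau_t = X_{min(t, tau)}."""
    return np.where(t <= tau, X, X[np.searchsorted(t, tau)])

# A localizing sequence: tau_k = inf{ t : |X_t| >= k }
# (set to the simulation horizon if the level is never reached).
for k in (1, 2, 3):
    hits = np.nonzero(np.abs(X) >= k)[0]
    tau_k = t[hits[0]] if hits.size else t[-1]
    X_stopped = stopped_path(X, t, tau_k)
    print(f"k={k}: tau_k ~ {tau_k:.3f}, stopped path frozen at {X_stopped[-1]:+.3f}")
```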

Examples

Example 1

Let [math]\displaystyle{ W_t }[/math] be the Wiener process and [math]\displaystyle{ T = \min \{ t : W_t = -1 \} }[/math] the time of first hit of −1. The stopped process [math]\displaystyle{ W_{\min \{ t, T \}} }[/math] is a martingale; its expectation is 0 at all times; nevertheless its limit (as t → ∞) is equal to −1 almost surely (a kind of gambler's ruin). A time change leads to a process

[math]\displaystyle{ \displaystyle X_t = \begin{cases} W_{\min\left(\tfrac{t}{1-t},\,T\right)} &\text{for } 0 \le t \lt 1,\\ -1 &\text{for } 1 \le t \lt \infty. \end{cases} }[/math]

The process [math]\displaystyle{ X_t }[/math] is continuous almost surely; nevertheless, its expectation is discontinuous,

[math]\displaystyle{ \displaystyle \operatorname{E} X_t = \begin{cases} 0 &\text{for } 0 \le t \lt 1,\\ -1 &\text{for } 1 \le t \lt \infty. \end{cases} }[/math]

This process is not a martingale. However, it is a local martingale. A localizing sequence may be chosen as [math]\displaystyle{ \tau_k = \min \{ t : X_t = k \} }[/math] if there is such t, otherwise [math]\displaystyle{ \tau_k = k }[/math]. This sequence diverges almost surely, since [math]\displaystyle{ \tau_k = k }[/math] for all k large enough (namely, for all k that exceed the maximal value of the process X). The process stopped at τk is a martingale.[details 1]
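
A rough Monte Carlo sketch of this example follows, assuming NumPy; the time step, the number of paths, and the query time t = 0.9 (which the time change maps to s = t/(1 − t) = 9) are arbitrary choices. It illustrates the jump of the expectation: the estimate of E X_t for t < 1 stays close to 0, while X_t = −1 identically for t ≥ 1.

```python
# Monte Carlo sketch of Example 1 (assumes NumPy).  The time grid only
# approximates the hitting time T, so the estimate carries a small bias,
# but it illustrates the jump of E[X_t] at t = 1.
import numpy as np

rng = np.random.default_rng(1)
n_paths, dt = 5_000, 0.01
t_query = 0.9                        # a time before 1; the time change gives s = t/(1-t)
s_query = t_query / (1.0 - t_query)  # = 9.0
n_steps = int(s_query / dt)

# Simulate W on [0, s_query] and freeze each path at -1 from its first hit onward.
dW = np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
W = np.cumsum(dW, axis=1)
hit = np.cumsum(W <= -1.0, axis=1) > 0     # True from the first hit of -1 onward
W_stopped = np.where(hit, -1.0, W)

X_before_1 = W_stopped[:, -1]              # X_{0.9} = W_{min(s_query, T)}
print("estimate of E[X_t] for t = 0.9 :", X_before_1.mean())   # close to 0
print("X_t for t >= 1 is identically  :", -1.0)                # since T < infinity a.s.
```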

Example 2

Let [math]\displaystyle{ W_t }[/math] be the Wiener process and [math]\displaystyle{ f }[/math] a measurable function such that [math]\displaystyle{ \operatorname{E} |f(W_1)| \lt \infty. }[/math] Then the following process is a martingale:

[math]\displaystyle{ X_t = \operatorname{E} ( f(W_1) \mid F_t ) = \begin{cases} f_{1-t}(W_t) &\text{for } 0 \le t \lt 1,\\ f(W_1) &\text{for } 1 \le t \lt \infty; \end{cases} }[/math]

where

[math]\displaystyle{ f_s(x) = \operatorname{E} f(x+W_s) = \int f(x+y) \frac1{\sqrt{2\pi s}} \mathrm{e}^{-y^2/(2s)} \, dy. }[/math]

Using the Dirac delta function [math]\displaystyle{ \delta }[/math] (strictly speaking, not a function) in place of [math]\displaystyle{ f }[/math] leads to a process defined informally as [math]\displaystyle{ Y_t = \operatorname{E} ( \delta(W_1) \mid F_t ) }[/math] and formally as

[math]\displaystyle{ Y_t = \begin{cases} \delta_{1-t}(W_t) &\text{for } 0 \le t \lt 1,\\ 0 &\text{for } 1 \le t \lt \infty, \end{cases} }[/math]

where

[math]\displaystyle{ \delta_s(x) = \frac1{\sqrt{2\pi s}} \mathrm{e}^{-x^2/(2s)} . }[/math]

The process [math]\displaystyle{ Y_t }[/math] is continuous almost surely (since [math]\displaystyle{ W_1 \ne 0 }[/math] almost surely); nevertheless, its expectation is discontinuous,

[math]\displaystyle{ \operatorname{E} Y_t = \begin{cases} 1/\sqrt{2\pi} &\text{for } 0 \le t \lt 1,\\ 0 &\text{for } 1 \le t \lt \infty. \end{cases} }[/math]

This process is not a martingale. However, it is a local martingale. A localizing sequence may be chosen as [math]\displaystyle{ \tau_k = \min \{ t : Y_t = k \} }[/math] if there is such t, otherwise [math]\displaystyle{ \tau_k = k }[/math].
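
A short numerical check of the displayed expectation is given below, assuming NumPy; the sample size and the chosen values of t are arbitrary. For t < 1 the Monte Carlo average of δ_{1−t}(W_t) should be close to 1/√(2π) ≈ 0.3989, while Y_t = 0 for t ≥ 1.

```python
# Sketch (assumes NumPy): Monte Carlo check of E[Y_t] from Example 2.
# For t < 1,  Y_t = delta_{1-t}(W_t)  with  W_t ~ N(0, t); the average
# should be close to 1/sqrt(2*pi), independently of t.
import numpy as np

rng = np.random.default_rng(2)

def delta_s(x, s):
    """Gaussian heat kernel delta_s(x) = exp(-x^2/(2s)) / sqrt(2*pi*s)."""
    return np.exp(-x**2 / (2.0 * s)) / np.sqrt(2.0 * np.pi * s)

for t in (0.25, 0.5, 0.9):
    W_t = np.sqrt(t) * rng.standard_normal(1_000_000)
    print(f"t={t}: E[Y_t] ~ {delta_s(W_t, 1.0 - t).mean():.4f}")

print("target 1/sqrt(2*pi) =", 1.0 / np.sqrt(2.0 * np.pi))
print("E[Y_t] for t >= 1   = 0  (since W_1 != 0 almost surely)")
```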

Example 3

Let [math]\displaystyle{ Z_t }[/math] be the complex-valued Wiener process, and

[math]\displaystyle{ X_t = \ln | Z_t - 1 | \, . }[/math]

The process [math]\displaystyle{ X_t }[/math] is continuous almost surely (since [math]\displaystyle{ Z_t }[/math] does not hit 1, almost surely), and is a local martingale, since the function [math]\displaystyle{ u \mapsto \ln|u-1| }[/math] is harmonic (on the complex plane without the point 1). A localizing sequence may be chosen as [math]\displaystyle{ \tau_k = \min \{ t : X_t = -k \}. }[/math] Nevertheless, the expectation of this process is non-constant; moreover,

[math]\displaystyle{ \operatorname{E} X_t \to \infty }[/math]   as [math]\displaystyle{ t \to \infty, }[/math]

which can be deduced from the fact that the mean value of [math]\displaystyle{ \ln|u-1| }[/math] over the circle [math]\displaystyle{ |u|=r }[/math] tends to infinity as [math]\displaystyle{ r \to \infty }[/math]. (In fact, it is equal to [math]\displaystyle{ \ln r }[/math] for r ≥ 1 but to 0 for r ≤ 1).
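
The stated circle averages are easy to check numerically. The following sketch, assuming NumPy (the grid size and the radii are arbitrary choices), compares the average of ln|u − 1| over the circle |u| = r with max(ln r, 0).

```python
# Sketch (assumes NumPy): numerical check that the mean value of ln|u - 1|
# over the circle |u| = r equals ln(r) for r >= 1 and 0 for r <= 1.
import numpy as np

n = 200_000
theta = (np.arange(n) + 0.5) * (2.0 * np.pi / n)   # midpoint grid avoids u = 1 exactly
for r in (0.5, 1.0, 2.0, 10.0):
    u = r * np.exp(1j * theta)
    mean_val = np.log(np.abs(u - 1.0)).mean()
    print(f"r={r:>5}: circle average = {mean_val:+.4f},  max(ln r, 0) = {max(np.log(r), 0.0):+.4f}")
```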

Martingales via local martingales

Let [math]\displaystyle{ M_t }[/math] be a local martingale. In order to prove that it is a martingale it is sufficient to prove that [math]\displaystyle{ M_t^{\tau_k} \to M_t }[/math] in L1 (as [math]\displaystyle{ k \to \infty }[/math]) for every t, that is, [math]\displaystyle{ \operatorname{E} | M_t^{\tau_k} - M_t | \to 0; }[/math] here [math]\displaystyle{ M_t^{\tau_k} = M_{t\wedge \tau_k} }[/math] is the stopped process. The given relation [math]\displaystyle{ \tau_k \to \infty }[/math] implies that [math]\displaystyle{ M_t^{\tau_k} \to M_t }[/math] almost surely. The dominated convergence theorem ensures the convergence in L1 provided that

[math]\displaystyle{ \textstyle (*) \quad \operatorname{E} \sup_k| M_t^{\tau_k} | \lt \infty }[/math]    for every t.

Thus, Condition (*) is sufficient for a local martingale [math]\displaystyle{ M_t }[/math] to be a martingale. A stronger condition

[math]\displaystyle{ \textstyle (**) \quad \operatorname{E} \sup_{s\in[0,t]} |M_s| \lt \infty }[/math]    for every t

is also sufficient.

Caution. The weaker condition

[math]\displaystyle{ \textstyle \sup_{s\in[0,t]} \operatorname{E} |M_s| \lt \infty }[/math]    for every t

is not sufficient. Moreover, the condition

[math]\displaystyle{ \textstyle \sup_{t\in[0,\infty)} \operatorname{E} \mathrm{e}^{|M_t|} \lt \infty }[/math]

is still not sufficient; for a counterexample see Example 3 above.

A special case:

[math]\displaystyle{ \textstyle M_t = f(t,W_t), }[/math]

where [math]\displaystyle{ W_t }[/math] is the Wiener process, and [math]\displaystyle{ f : [0,\infty) \times \mathbb{R} \to \mathbb{R} }[/math] is twice continuously differentiable. The process [math]\displaystyle{ M_t }[/math] is a local martingale if and only if f satisfies the PDE

[math]\displaystyle{ \Big( \frac{\partial}{\partial t} + \frac12 \frac{\partial^2}{\partial x^2} \Big) f(t,x) = 0. }[/math]

However, this PDE itself does not ensure that [math]\displaystyle{ M_t }[/math] is a martingale. In order to apply (**) the following condition on f is sufficient: for every [math]\displaystyle{ \varepsilon\gt 0 }[/math] and t there exists [math]\displaystyle{ C = C(\varepsilon,t) }[/math] such that

[math]\displaystyle{ \textstyle |f(s,x)| \le C \mathrm{e}^{\varepsilon x^2} }[/math]

for all [math]\displaystyle{ s \in [0,t] }[/math] and [math]\displaystyle{ x \in \mathbb{R}. }[/math]
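
As an illustration, the polynomial f(t, x) = x³ − 3tx (a choice made here for the example, not taken from the text) satisfies the PDE and the growth condition above, so M_t = f(t, W_t) is a true martingale. A short check, assuming SymPy and NumPy, follows.

```python
# Sketch (assumes SymPy and NumPy): f(t, x) = x**3 - 3*t*x is an illustrative
# choice satisfying  f_t + (1/2) f_xx = 0  and  |f(s, x)| <= C * exp(eps * x**2),
# so M_t = f(t, W_t) is a martingale, not just a local martingale.
import numpy as np
import sympy as sp

t, x = sp.symbols("t x")
f = x**3 - 3 * t * x
print("f_t + f_xx/2 =", sp.simplify(sp.diff(f, t) + sp.diff(f, x, 2) / 2))   # -> 0

# Monte Carlo check that E[f(t, W_t)] stays at f(0, W_0) = 0.
rng = np.random.default_rng(3)
for tt in (0.5, 1.0, 4.0):
    W = np.sqrt(tt) * rng.standard_normal(1_000_000)
    print(f"t={tt}: E[f(t, W_t)] ~ {(W**3 - 3 * tt * W).mean():+.3f}")
```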

Technical details

  1. For the times before 1 it is a martingale, since a time-changed stopped Brownian motion is one. After the instant 1 it is constant. It remains to check the martingale property at the instant 1. By the bounded convergence theorem the expectation at 1 is the limit of the expectation at (n−1)/n (as n tends to infinity), and the latter does not depend on n. The same argument applies to the conditional expectation.
