Schilder's theorem

From HandWiki

In mathematics, Schilder's theorem is a generalization of the Laplace method from integrals on [math]\displaystyle{ \mathbb{R}^n }[/math] to functional Wiener integration. The theorem is used in the large deviations theory of stochastic processes. Roughly speaking, Schilder's theorem gives an estimate for the probability that a (scaled-down) sample path of Brownian motion will stray far from the mean path (which is constant with value 0). This statement is made precise using rate functions. Schilder's theorem is generalized by the Freidlin–Wentzell theorem for Itō diffusions.

Statement of the theorem

Let C0 = C0([0, T]; Rd) be the Banach space of continuous functions [math]\displaystyle{ f : [0,T] \longrightarrow \mathbf{R}^d }[/math] such that [math]\displaystyle{ f(0)=0 }[/math], equipped with the supremum norm ||·||∞, and let [math]\displaystyle{ C_0^\ast }[/math] be the subspace of absolutely continuous functions whose derivative is in [math]\displaystyle{ L^2 }[/math] (the so-called Cameron–Martin space). Define the rate function

[math]\displaystyle{ I(\omega) = \frac{1}{2} \int_{0}^{T} \| \dot{\omega}(t) \|^{2} \, \mathrm{d} t }[/math]

on [math]\displaystyle{ C_0^\ast }[/math] and let [math]\displaystyle{ F:C_0\to\mathbb{R},G:C_0\to\mathbb{C} }[/math] be two given functions, such that [math]\displaystyle{ S:=I+F }[/math] (the "action") has a unique minimum [math]\displaystyle{ \Omega\in C_0^\ast }[/math].

Then under some differentiability and growth assumptions on [math]\displaystyle{ F,G }[/math] which are detailed in Schilder 1966, one has

[math]\displaystyle{ \lim_{\lambda\to\infty}\frac{\mathbb{E}\left[\exp\left(-\lambda F(\lambda^{-1/2} \omega)\right)G(\lambda^{-1/2} \omega)\right]}{\exp\left(-\lambda S(\Omega)\right)} = G(\Omega)\mathbb{E}\left[\exp\left(-\frac{1}{2}\langle\omega, D(\Omega) \omega\rangle\right)\right] }[/math]

where [math]\displaystyle{ \mathbb{E} }[/math] denotes expectation with respect to the Wiener measure [math]\displaystyle{ \mathbb{P} }[/math] on [math]\displaystyle{ C_0 }[/math] and [math]\displaystyle{ D(\Omega) }[/math] is the Hessian of [math]\displaystyle{ F }[/math] at the minimum [math]\displaystyle{ \Omega }[/math]; [math]\displaystyle{ \langle\omega, D(\Omega) \omega\rangle }[/math] is meant in the sense of an [math]\displaystyle{ L^2([0,T]) }[/math] inner product.

Application to large deviations on the Wiener measure

Let B be a standard Brownian motion in d-dimensional Euclidean space Rd starting at the origin, 0 ∈ Rd; let W denote the law of B, i.e. classical Wiener measure. For ε > 0, let Wε denote the law of the rescaled process √εB. Then, on the Banach space C0 = C0([0, T]; Rd) of continuous functions [math]\displaystyle{ f : [0,T] \longrightarrow \mathbf{R}^d }[/math] such that [math]\displaystyle{ f(0)=0 }[/math], equipped with the supremum norm ||·||, the probability measures Wε satisfy the large deviations principle with good rate function I : C0 → R ∪ {+∞} given by

[math]\displaystyle{ I(\omega) = \frac{1}{2} \int_{0}^{T} | \dot{\omega}(t) |^{2} \, \mathrm{d} t }[/math]

if ω is absolutely continuous, and I(ω) = +∞ otherwise. In other words, for every open set G ⊆ C0 and every closed set F ⊆ C0,

[math]\displaystyle{ \limsup_{\varepsilon \downarrow 0} \varepsilon \log \mathbf{W}_{\varepsilon} (F) \leq - \inf_{\omega \in F} I(\omega) }[/math]

and

[math]\displaystyle{ \liminf_{\varepsilon \downarrow 0} \varepsilon \log \mathbf{W}_{\varepsilon} (G) \geq - \inf_{\omega \in G} I(\omega). }[/math]
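As a concrete illustration (not part of the original statement), the rate function above can be evaluated numerically for sampled paths. The helper below, its grid size, and the two test paths are illustrative choices; it approximates the derivative by forward differences for a real-valued (d = 1) path.

```python
import numpy as np

def rate_function(omega, T):
    """Approximate I(omega) = (1/2) * int_0^T |omega'(t)|^2 dt for a
    real-valued path sampled on a uniform grid over [0, T], using
    forward differences for the derivative."""
    n = len(omega) - 1          # number of grid intervals
    dt = T / n
    deriv = np.diff(omega) / dt
    return 0.5 * np.sum(deriv ** 2) * dt

T = 2.0
t = np.linspace(0.0, T, 1001)
# The straight path omega(t) = t/T has I(omega) = 1/(2T) = 0.25 exactly.
print(rate_function(t / T, T))
# A smooth comparison path omega(t) = sin(pi t / T) has I = pi^2/(4T).
print(rate_function(np.sin(np.pi * t / T), T))
```

Paths that are not absolutely continuous (e.g. a typical Brownian sample path) have I = +∞; on a discrete grid this shows up as a value that diverges as the grid is refined.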

Example

Taking ε = 1/c², one can use Schilder's theorem to obtain estimates for the probability that a standard Brownian motion B strays further than c from its starting point over the time interval [0, T], i.e. the probability

[math]\displaystyle{ \mathbf{W} (C_0 \smallsetminus \mathbf{B}_c (0; \| \cdot \|_\infty)) \equiv \mathbf{P} \big[ \| B \|_\infty \gt c \big], }[/math]

as c tends to infinity. Here Bc(0; ||·||∞) denotes the open ball of radius c about the zero function in C0, taken with respect to the supremum norm. First note that

[math]\displaystyle{ \| B \|_\infty \gt c \iff \sqrt{\varepsilon} B \in A := \left \{ \omega \in C_0 \mid |\omega(t)| \gt 1 \text{ for some } t \in [0, T] \right\}. }[/math]

Since the infimum of the rate function over the open set A equals its infimum over the closure of A, the upper and lower bounds of the large deviations principle coincide, and Schilder's theorem yields

[math]\displaystyle{ \begin{align} \lim_{c \to \infty} \frac{\log \left (\mathbf{P} \left [ \| B \|_\infty \gt c \right] \right )}{c^2} &= \lim_{\varepsilon \to 0} \varepsilon \log \left (\mathbf{P} \left[ \sqrt{\varepsilon} B \in A \right] \right ) \\[6pt] &= - \inf \left\{ \left. \frac{1}{2} \int_0^T | \dot{\omega}(t) |^2 \, \mathrm{d} t \,\right|\, \omega \in A \right\} \\[6pt] &= - \frac{1}{2} \int_0^T \frac{1}{T^2} \, \mathrm{d} t \\[6pt] &= - \frac{1}{2 T}, \end{align} }[/math]

making use of the fact that the infimum over paths in the collection A is attained for ω(t) = t/T. This result can be heuristically interpreted as saying that, for large c,

[math]\displaystyle{ \frac{\log \left (\mathbf{P} \left [ \| B \|_\infty \gt c \right] \right )}{c^2} \approx - \frac{1}{2T} \qquad \text{or} \qquad \mathbf{P} \left[ \| B \|_\infty \gt c \right ] \approx \exp \left( - \frac{c^2}{2 T} \right). }[/math]

In fact, the above probability can be estimated more precisely: for B a standard Brownian motion in Rn, and any T, c and ε > 0, we have:

[math]\displaystyle{ \mathbf{P} \left[ \sup_{0 \leq t \leq T} \left| \sqrt{\varepsilon} B_t \right | \geq c \right] \leq 4 n \exp \left( - \frac{c^2}{2 n T \varepsilon} \right). }[/math]
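This non-asymptotic bound can likewise be sanity-checked by simulation; the parameters below are illustrative choices for the one-dimensional case n = 1.

```python
import numpy as np

rng = np.random.default_rng(1)
n_dim = 1                     # the dimension n appearing in the bound
T, c, eps = 1.0, 1.5, 0.5
n_paths, n_steps = 20_000, 500
dt = T / n_steps

# One-dimensional Brownian paths on a uniform grid of [0, T].
incr = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
sup_abs = np.abs(np.cumsum(incr, axis=1)).max(axis=1)
p_hat = np.mean(np.sqrt(eps) * sup_abs >= c)

# Right-hand side of the bound: 4 n exp(-c^2 / (2 n T eps)).
bound = 4 * n_dim * np.exp(-c**2 / (2 * n_dim * T * eps))
print(p_hat, bound)           # the estimate should stay below the bound
```

For these parameters the bound is far from tight, which is typical: it trades sharpness for validity at every T, c and ε > 0 rather than only in the ε → 0 limit.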

References

  • Dembo, Amir; Zeitouni, Ofer (1998). Large deviations techniques and applications. Applications of Mathematics (New York) 38 (Second ed.). New York: Springer-Verlag. pp. xvi+396. ISBN 0-387-98406-2.  (See theorem 5.2)
  • Schilder, M. (1966). "Some asymptotic formulas for Wiener integrals". Transactions of the American Mathematical Society 125: 63–85.