Continuous stochastic process

Short description: Stochastic process that is a continuous function of time or index parameter

In probability theory, a continuous stochastic process is a type of stochastic process that may be said to be "continuous" as a function of its "time" or index parameter. Continuity is a nice property for (the sample paths of) a process to have, since it implies that they are well-behaved in some sense, and, therefore, much easier to analyze. It is implicit here that the index of the stochastic process is a continuous variable. Some authors[1] define a "continuous (stochastic) process" as only requiring that the index variable be continuous, without continuity of sample paths: in another terminology, this would be a continuous-time stochastic process, in parallel to a "discrete-time process". Given the possible confusion, caution is needed.[1]

Definitions

Let (Ω, Σ, P) be a probability space, let T be some interval of time, and let X : T × Ω → S be a stochastic process. For simplicity, the rest of this article will take the state space S to be the real line R, but the definitions go through mutatis mutandis if S is Rⁿ, a normed vector space, or even a general metric space.

Continuity almost surely

Given a time t ∈ T, X is said to be continuous with probability one at t if

[math]\displaystyle{ \mathbf{P} \left( \left\{ \omega \in \Omega \left| \lim_{s \to t} \big| X_{s} (\omega) - X_{t} (\omega) \big| = 0 \right. \right\} \right) = 1. }[/math]
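
For example, for a Poisson process N with rate λ and any fixed time t, the probability that a jump occurs exactly at t is zero, so

[math]\displaystyle{ \mathbf{P} \left( \left\{ \omega \in \Omega \left| \lim_{s \to t} \big| N_{s} (\omega) - N_{t} (\omega) \big| = 0 \right. \right\} \right) = 1, }[/math]

i.e. N is continuous with probability one at each fixed t, even though its sample paths are step functions with jumps.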

Mean-square continuity

Given a time t ∈ T, X is said to be continuous in mean-square at t if E[|Xt|²] < +∞ and

[math]\displaystyle{ \lim_{s \to t} \mathbf{E} \left[ \big| X_{s} - X_{t} \big|^{2} \right] = 0. }[/math]
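
For example, a standard Brownian motion B satisfies E[|Bs − Bt|²] = |s − t|, so

[math]\displaystyle{ \lim_{s \to t} \mathbf{E} \left[ \big| B_{s} - B_{t} \big|^{2} \right] = \lim_{s \to t} | s - t | = 0, }[/math]

and Brownian motion is therefore continuous in mean-square at every t.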

Continuity in probability

Given a time t ∈ T, X is said to be continuous in probability at t if, for all ε > 0,

[math]\displaystyle{ \lim_{s \to t} \mathbf{P} \left( \left\{ \omega \in \Omega \left| \big| X_{s} (\omega) - X_{t} (\omega) \big| \geq \varepsilon \right. \right\} \right) = 0. }[/math]

Equivalently, X is continuous in probability at time t if

[math]\displaystyle{ \lim_{s \to t} \mathbf{E} \left[ \frac{\big| X_{s} - X_{t} \big|}{1 + \big| X_{s} - X_{t} \big|} \right] = 0. }[/math]
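
For example, for a Poisson process N with rate λ, the increment Ns − Nt is non-zero only if at least one jump occurs between the times s and t, so for every ε > 0

[math]\displaystyle{ \mathbf{P} \left( \big| N_{s} - N_{t} \big| \geq \varepsilon \right) \leq \mathbf{P} \left( N_{s} \neq N_{t} \right) = 1 - e^{- \lambda | s - t |}, }[/math]

which tends to 0 as s → t. Hence the Poisson process is continuous in probability at every t, even though its sample paths are not continuous.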

Continuity in distribution

Given a time t ∈ T, X is said to be continuous in distribution at t if

[math]\displaystyle{ \lim_{s \to t} F_{s} (x) = F_{t} (x) }[/math]

for all points x at which Ft is continuous, where Ft denotes the cumulative distribution function of the random variable Xt.
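
For example, for a standard Brownian motion B and any time t > 0, the random variable Bs has the N(0, s) distribution, so for every x

[math]\displaystyle{ \lim_{s \to t} F_{s} (x) = \lim_{s \to t} \Phi \left( \frac{x}{\sqrt{s}} \right) = \Phi \left( \frac{x}{\sqrt{t}} \right) = F_{t} (x), }[/math]

where Φ denotes the standard normal cumulative distribution function; hence Brownian motion is continuous in distribution at every t > 0.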

Sample continuity

X is said to be sample continuous if Xt(ω) is continuous in t for P-almost all ω ∈ Ω. Sample continuity is the appropriate notion of continuity for processes such as Itō diffusions.
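
For example, Brownian motion is sample continuous, since P-almost every one of its sample paths is a continuous function of t. The Poisson process, on the other hand, is not sample continuous (assuming T has positive length): with strictly positive probability its sample path has a jump somewhere in T, even though the process is continuous in probability, and indeed with probability one, at every fixed t.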

Feller continuity

Main page: Feller-continuous process

X is said to be a Feller-continuous process if, for any fixed t ∈ T and any bounded, continuous function g : S → R, the expectation Ex[g(Xt)] depends continuously upon x. Here x denotes the initial state of the process X, and Ex denotes expectation conditional upon the event that X starts at x.
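
For example, Brownian motion is Feller-continuous: for fixed t > 0 and bounded, continuous g,

[math]\displaystyle{ \mathbf{E}^{x} \left[ g(B_{t}) \right] = \int_{\mathbf{R}} g(x + z) \, \frac{e^{- z^{2} / (2 t)}}{\sqrt{2 \pi t}} \, \mathrm{d} z, }[/math]

which depends continuously on x by the dominated convergence theorem, since g is bounded and continuous.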

Relationships

The relationships between the various types of continuity of stochastic processes are akin to the relationships between the various types of convergence of random variables. In particular:

  • continuity with probability one implies continuity in probability;
  • continuity in mean-square implies continuity in probability (see the bound below);
  • continuity with probability one neither implies, nor is implied by, continuity in mean-square;
  • continuity in probability implies, but is not implied by, continuity in distribution.
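
The second of these implications follows, for instance, from Chebyshev's inequality: for every ε > 0,

[math]\displaystyle{ \mathbf{P} \left( \big| X_{s} - X_{t} \big| \geq \varepsilon \right) \leq \frac{\mathbf{E} \left[ \big| X_{s} - X_{t} \big|^{2} \right]}{\varepsilon^{2}}, }[/math]

and the right-hand side tends to 0 as s → t whenever X is continuous in mean-square at t.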

It is tempting to confuse continuity with probability one with sample continuity. Continuity with probability one at time t means that P(At) = 0, where the event At is given by

[math]\displaystyle{ A_{t} = \left\{ \omega \in \Omega \left| \lim_{s \to t} \big| X_{s} (\omega) - X_{t} (\omega) \big| \neq 0 \right. \right\}, }[/math]

and it is perfectly feasible to check whether or not this holds for each t ∈ T. Sample continuity, on the other hand, requires that P(A) = 0, where

[math]\displaystyle{ A = \bigcup_{t \in T} A_{t}. }[/math]

Since A is an uncountable union of events, it may fail to be an event itself, in which case P(A) is undefined. Worse still, even when A is an event, P(A) can be strictly positive even though P(At) = 0 for every t ∈ T. This is the case, for example, with the telegraph process.
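
The Poisson process illustrates the same phenomenon: as noted above, P(At) = 0 for every fixed t, yet the event that the path has at least one jump in T, and is therefore discontinuous at some time, has strictly positive probability whenever T has positive length.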

Notes

  1. Dodge, Y. (2006). The Oxford Dictionary of Statistical Terms. OUP. ISBN 0-19-920613-9. (Entry for "continuous process".)
