Continuity in probability
In probability theory, a stochastic process is said to be continuous in probability or stochastically continuous if its values converge in probability whenever the corresponding values in the index set converge. [1][2]
Definition
Let [math]\displaystyle{ X=(X_t)_{t \in T} }[/math] be a stochastic process in [math]\displaystyle{ \R^n }[/math]. The process [math]\displaystyle{ X }[/math] is continuous in probability when [math]\displaystyle{ X_r }[/math] converges in probability to [math]\displaystyle{ X_s }[/math] whenever [math]\displaystyle{ r }[/math] converges to [math]\displaystyle{ s }[/math].[2]
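Continuity in probability does not require the sample paths themselves to be continuous. As an illustrative sketch (not part of the cited sources), the following simulation estimates [math]\displaystyle{ P(|X_s - X_r| \geq 1) }[/math] for a rate-1 Poisson process, whose paths jump but which is nonetheless continuous in probability, since this probability equals [math]\displaystyle{ 1 - e^{-(s-r)} \to 0 }[/math] as [math]\displaystyle{ r \to s }[/math]:

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_of_jump(r, s, n_paths=100_000, rate=1.0):
    """Monte Carlo estimate of P(|N_s - N_r| >= 1) for a Poisson process.

    The increment N_s - N_r of a rate-`rate` Poisson process is
    Poisson-distributed with mean rate * (s - r), so we can sample
    the increments directly instead of simulating whole paths.
    """
    increments = rng.poisson(rate * (s - r), size=n_paths)
    return float(np.mean(increments >= 1))

# As r approaches s = 1, the probability of seeing any jump between
# r and s shrinks to 0, matching the exact value 1 - exp(-(s - r)).
s = 1.0
for r in [0.5, 0.9, 0.99, 0.999]:
    est = prob_of_jump(r, s)
    exact = 1 - np.exp(-(s - r))
    print(f"r={r}: estimated {est:.4f}, exact {exact:.4f}")
```

The process here is hypothetical and chosen only because its increment distribution is known in closed form, which makes the convergence in probability easy to check numerically.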
Examples and Applications
Feller processes are continuous in probability at [math]\displaystyle{ t=0 }[/math]. Continuity in probability is sometimes used as one of the defining properties of a Lévy process.[1] Any process that is continuous in probability and has independent increments has a version that is càdlàg.[2] As a result, some authors immediately define a Lévy process as being càdlàg and having independent increments.[3]
References
- ↑ 1.0 1.1 Applebaum, D. "Lectures on Lévy processes and Stochastic calculus, Braunschweig; Lecture 2: Lévy processes". University of Sheffield. pp. 37–53. http://www.applebaum.staff.shef.ac.uk/Brauns2notes.pdf.
- ↑ 2.0 2.1 2.2 Kallenberg, Olav (2002). Foundations of Modern Probability (2nd ed.). New York: Springer. p. 286.
- ↑ Kallenberg, Olav (2002). Foundations of Modern Probability (2nd ed.). New York: Springer. p. 290.