Convergence, almost-certain

From HandWiki

almost-sure convergence, convergence with probability one

Convergence of a sequence of random variables $X_1, \ldots, X_n, \ldots$, defined on a certain probability space $(\Omega, \mathcal{A}, P)$, to a random variable $X$, defined in the following way: $X_n \to X$ almost certainly (or $X_n \to X$ $P$-almost certainly) if

$$P\{\omega : X_n(\omega) \to X(\omega) \ \text{as}\ n \to \infty\} = 1.$$
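A classical instance of almost-certain convergence is the strong law of large numbers: for i.i.d. random variables with finite mean, the sample means converge to that mean for $P$-almost every $\omega$. The following sketch (using NumPy; the seed and sample size are illustrative choices, not part of the definition) simulates one sample path of means of fair coin flips and observes that its tail stays near $1/2$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate i.i.d. Bernoulli(1/2) variables X_1, ..., X_n: by the strong
# law of large numbers, the sample means converge to 1/2 almost certainly.
flips = rng.integers(0, 2, size=100_000)
sample_means = np.cumsum(flips) / np.arange(1, len(flips) + 1)

# Along this single sample path, the tail of the sequence of means
# remains close to 1/2.
tail_deviation = np.abs(sample_means[-1000:] - 0.5).max()
print(tail_deviation)
```

A single simulated path cannot prove almost-certain convergence, of course; it only illustrates the pathwise behaviour that the definition requires of $P$-almost every $\omega$.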

In mathematical analysis this form of convergence is called almost-everywhere convergence. Almost-certain convergence implies convergence in probability, but the converse does not hold in general.
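The implication can be seen from a standard equivalence in terms of the tail events $\{\sup_{k \ge n} |X_k - X| \ge \varepsilon\}$ (a sketch, with all symbols as in the definition above):

```latex
\begin{align*}
  % The event of non-convergence decomposes over \varepsilon > 0:
  \{\omega : X_n(\omega) \not\to X(\omega)\}
    &= \bigcup_{\varepsilon > 0} \bigcap_{n \ge 1}
       \Bigl\{\sup_{k \ge n} |X_k - X| \ge \varepsilon\Bigr\}, \\
  % so almost-certain convergence forces, for each fixed \varepsilon > 0,
  \lim_{n \to \infty} P\Bigl\{\sup_{k \ge n} |X_k - X| \ge \varepsilon\Bigr\} &= 0, \\
  % and convergence in probability follows by monotonicity of P:
  P\{|X_n - X| \ge \varepsilon\}
    &\le P\Bigl\{\sup_{k \ge n} |X_k - X| \ge \varepsilon\Bigr\} \to 0.
\end{align*}
```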


Comments

See also Convergence, types of; Weak convergence of probability measures; Distributions, convergence of.