Information source (mathematics)
From HandWiki
In mathematics, an information source is a sequence of random variables ranging over a finite alphabet Γ, having a stationary distribution. The uncertainty, or entropy rate, of an information source is defined as
- [math]\displaystyle{ H\{\mathbf{X}\} = \lim_{n\to\infty} H(X_n | X_0, X_1, \dots, X_{n-1}) }[/math]
where
- [math]\displaystyle{ X_0, X_1, \dots, X_n }[/math]
is the sequence of random variables defining the information source, and
- [math]\displaystyle{ H(X_n | X_0, X_1, \dots, X_{n-1}) }[/math]
is the conditional entropy of [math]\displaystyle{ X_n }[/math] given all the preceding variables. Equivalently, one has
- [math]\displaystyle{ H\{\mathbf{X}\} = \lim_{n\to\infty} \frac{H(X_0, X_1, \dots, X_{n-1}, X_n)}{n+1}. }[/math]
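As an illustration (not part of the original article), for a stationary Markov source the conditional entropy [math]\displaystyle{ H(X_n | X_0, \dots, X_{n-1}) }[/math] depends only on the previous symbol, so the limit reduces to the closed form [math]\displaystyle{ H\{\mathbf{X}\} = -\sum_i \pi_i \sum_j P_{ij} \log_2 P_{ij} }[/math], where [math]\displaystyle{ P }[/math] is the transition matrix and [math]\displaystyle{ \pi }[/math] its stationary distribution. The sketch below computes this for a hypothetical two-state source; the transition probabilities are example values, not from the source text.

```python
import math

# Transition matrix of a hypothetical two-state stationary Markov source.
# Row i gives the distribution of the next symbol given current symbol i.
P = [[0.9, 0.1],
     [0.4, 0.6]]

# Stationary distribution pi solving pi P = pi, written in closed form
# for the 2-state case: pi = (P[1][0], P[0][1]) / (P[0][1] + P[1][0]).
s = P[0][1] + P[1][0]
pi = [P[1][0] / s, P[0][1] / s]  # here pi = [0.8, 0.2]

# Entropy rate in bits per symbol: H = -sum_i pi_i sum_j P_ij log2 P_ij
H = -sum(pi[i] * P[i][j] * math.log2(P[i][j])
         for i in range(2) for j in range(2))
print(round(H, 4))  # about 0.5694 bits per symbol
```

Note that an i.i.d. source is the degenerate case in which every row of [math]\displaystyle{ P }[/math] is the same distribution, and the entropy rate is then just the entropy of a single symbol.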
Original source: https://en.wikipedia.org/wiki/Information source (mathematics).