Exchangeable random variables

Short description: Concept in statistics

In statistics, an exchangeable sequence of random variables (also sometimes interchangeable)[1] is a sequence X1, X2, X3, ... (which may be finitely or infinitely long) whose joint probability distribution does not change when the positions in the sequence in which finitely many of them appear are altered. In other words, the joint distribution is invariant under finite permutations. Thus, for example, the sequences

[math]\displaystyle{ X_1, X_2, X_3, X_4, X_5, X_6 \quad \text{ and } \quad X_3, X_6, X_1, X_5, X_2, X_4 }[/math]

both have the same joint probability distribution.

Exchangeability is closely related to the use of independent and identically distributed random variables in statistical models. Exchangeable sequences of random variables arise in cases of simple random sampling.

Definition

Formally, an exchangeable sequence of random variables is a finite or infinite sequence X1, X2, X3, ... of random variables such that for any finite permutation σ of the indices 1, 2, 3, ... (that is, a permutation that moves only finitely many indices and fixes the rest), the joint probability distribution of the permuted sequence

[math]\displaystyle{ X_{\sigma(1)}, X_{\sigma(2)}, X_{\sigma(3)}, \dots }[/math]

is the same as the joint probability distribution of the original sequence.[1][2]

(A sequence E1, E2, E3, ... of events is said to be exchangeable precisely if the sequence of its indicator functions is exchangeable.) The distribution function FX1,...,Xn(x1, ..., xn) of a finite sequence of exchangeable random variables is symmetric in its arguments x1, ..., xn. Olav Kallenberg provided an appropriate definition of exchangeability for continuous-time stochastic processes.[3][4]
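
As a concrete illustration of the definition (a minimal sketch, not from the source, using an assumed urn with 2 red and 3 blue marbles), the following Python code computes the exact joint probability mass function of two draws without replacement and checks that it is symmetric in its arguments, so the pair is exchangeable even though the draws are not independent.

```python
from fractions import Fraction

# Assumed example: an urn with 2 red (coded 1) and 3 blue (coded 0) marbles,
# sampled twice without replacement. The pair (X1, X2) is exchangeable
# but not independent.
counts = {1: 2, 0: 3}
total = sum(counts.values())

def joint(x1, x2):
    """Exact joint pmf P(X1 = x1, X2 = x2) for two draws without replacement."""
    p1 = Fraction(counts[x1], total)
    remaining = counts[x2] - (1 if x1 == x2 else 0)
    return p1 * Fraction(remaining, total - 1)

# The joint pmf is symmetric in its arguments (exchangeability).
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert joint(x1, x2) == joint(x2, x1)

# But the draws are not independent:
print(joint(1, 1) / (joint(1, 0) + joint(1, 1)))  # P(X2=1 | X1=1) = 1/4
print(joint(0, 1) + joint(1, 1))                  # P(X2=1)        = 2/5
```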

History

The concept was introduced by William Ernest Johnson in his 1924 book Logic, Part III: The Logical Foundations of Science.[5] Exchangeability is equivalent to the concept of statistical control introduced by Walter Shewhart also in 1924.[6][7]

Exchangeability and the i.i.d. statistical model

The property of exchangeability is closely related to the use of independent and identically distributed (i.i.d.) random variables in statistical models. A sequence of random variables that are i.i.d., conditional on some underlying distributional form, is exchangeable. This follows directly from the structure of the joint probability distribution generated by the i.i.d. form.

Mixtures of exchangeable sequences (in particular, sequences of i.i.d. variables) are exchangeable. The converse can be established for infinite sequences, through an important representation theorem by Bruno de Finetti (later extended by other probability theorists such as Halmos and Savage). The extended versions of the theorem show that in any infinite sequence of exchangeable random variables, the random variables are conditionally independent and identically distributed, given the underlying distributional form. This theorem is stated briefly below. (De Finetti's original theorem only showed this to be true for random indicator variables, but this was later extended to encompass all sequences of random variables.) Another way of putting this is that de Finetti's theorem characterizes exchangeable sequences as mixtures of i.i.d. sequences: while an exchangeable sequence need not itself be unconditionally i.i.d., it can be expressed as a mixture of underlying i.i.d. sequences.[1]

This means that infinite sequences of exchangeable random variables can be regarded equivalently as sequences of conditionally i.i.d. random variables, based on some underlying distributional form. (Note that this equivalence does not quite hold for finite exchangeability. However, for finite vectors of random variables there is a close approximation to the i.i.d. model.) An infinite exchangeable sequence is strictly stationary and so a law of large numbers in the form of the Birkhoff–Khinchin theorem applies.[4] This means that the underlying distribution can be given an operational interpretation as the limiting empirical distribution of the sequence of values. The close relationship between exchangeable sequences of random variables and the i.i.d. form means that the latter can be justified on the basis of infinite exchangeability. This notion is central to Bruno de Finetti's development of predictive inference and to Bayesian statistics. It can also be shown to be a useful foundational assumption in frequentist statistics and to link the two paradigms.[8]

The representation theorem: This statement is based on the presentation in O'Neill (2009) in references below. Given an infinite sequence of random variables [math]\displaystyle{ \mathbf{X}=(X_1,X_2,X_3,\ldots) }[/math] we define the limiting empirical distribution function [math]\displaystyle{ F_\mathbf{X} }[/math] by

[math]\displaystyle{ F_\mathbf{X}(x) = \lim_{n\to\infty} \frac{1}{n} \sum_{i=1}^n I(X_i \le x). }[/math]

(This is the Cesàro limit of the indicator functions. In cases where the Cesàro limit does not exist, this function can actually be defined as the Banach limit of the indicator functions, which is an extension of this limit. This latter limit always exists for averages of indicator functions, since these form a bounded sequence, so the empirical distribution is always well-defined.) This means that for any finite vector of random variables in the sequence, the joint distribution function is given by

[math]\displaystyle{ \Pr (X_1 \le x_1,X_2 \le x_2,\ldots,X_n \le x_n) = \int \prod_{i=1}^n F_\mathbf{X}(x_i)\,dP(F_\mathbf{X}). }[/math]

If the distribution function [math]\displaystyle{ F_\mathbf{X} }[/math] is indexed by another parameter [math]\displaystyle{ \theta }[/math] then (with densities appropriately defined) we have

[math]\displaystyle{ p_{X_1,\ldots,X_n}(x_1,\ldots,x_n) = \int \prod_{i=1}^n p_{X_i}(x_i\mid\theta)\,dP(\theta). }[/math]

These equations show the joint distribution or density characterised as a mixture distribution based on the underlying limiting empirical distribution (or a parameter indexing this distribution).
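
To make the mixture representation concrete, the following Python sketch (an illustration under an assumed Beta(2, 2) mixing distribution, not part of the source) generates an exchangeable 0/1 sequence by first drawing a bias θ and then drawing i.i.d. Bernoulli(θ) terms; the empirical frequency of 1s converges to the realised θ, which plays the role of the limiting empirical distribution in the theorem, rather than to the unconditional mean 1/2.

```python
import random

random.seed(0)

def exchangeable_bernoulli_sequence(n, a=2.0, b=2.0):
    """Generate n terms of an exchangeable 0/1 sequence as a mixture of
    i.i.d. Bernoulli sequences: draw theta ~ Beta(a, b) once, then draw
    X_1, ..., X_n i.i.d. Bernoulli(theta)."""
    theta = random.betavariate(a, b)
    xs = [1 if random.random() < theta else 0 for _ in range(n)]
    return theta, xs

theta, xs = exchangeable_bernoulli_sequence(100_000)

# The limiting empirical frequency of 1s recovers the realised theta
# (the underlying distributional form), not the prior mean a / (a + b) = 0.5.
print(theta, sum(xs) / len(xs))
```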

Note that not all finite exchangeable sequences are mixtures of i.i.d. To see this, consider sampling without replacement from a finite set until no elements are left. The resulting sequence is exchangeable, but not a mixture of i.i.d. Indeed, conditioned on all other elements in the sequence, the remaining element is known.

Covariance and correlation

Exchangeable sequences have some basic covariance and correlation properties, which mean that they are generally positively correlated. For infinite sequences of exchangeable random variables, the covariance between the random variables is equal to the variance of the mean of the underlying distribution function.[8] For finite exchangeable sequences the covariance is also a fixed value that does not depend on which particular pair of random variables in the sequence is chosen. There is a weaker lower bound than for infinite exchangeability, and negative correlation is possible.

Covariance for exchangeable sequences (infinite): If the sequence [math]\displaystyle{ X_1,X_2,X_3,\ldots }[/math] is exchangeable, then

[math]\displaystyle{ \operatorname{cov} (X_i,X_j) = \operatorname{var} (\operatorname{E}(X_i\mid F_\mathbf{X})) = \operatorname{var} (\operatorname{E}(X_i\mid\theta)) \ge 0 \quad\text{for }i \ne j. }[/math]

Covariance for exchangeable sequences (finite): If [math]\displaystyle{ X_1,X_2,\ldots,X_n }[/math] is exchangeable with [math]\displaystyle{ \sigma^2 = \operatorname{var} (X_i) }[/math], then

[math]\displaystyle{ \operatorname{cov} (X_i,X_j) \ge - \frac{\sigma^2}{n-1} \quad\text{for }i \ne j. }[/math]

The finite sequence result may be proved as follows. Using the fact that the values are exchangeable, we have

[math]\displaystyle{ \begin{align} 0 & \le \operatorname{var}(X_1 + \cdots + X_n) \\ & = \operatorname{var}(X_1) + \cdots + \operatorname{var}(X_n) + \underbrace{\operatorname{cov}(X_1,X_2) + \cdots\quad{}}_\text{all ordered pairs} \\ & = n\sigma^2 + n(n-1)\operatorname{cov}(X_1,X_2). \end{align} }[/math]

We can then solve the inequality for the covariance, yielding the stated lower bound. The non-negativity of the covariance for the infinite sequence can then be obtained as a limiting result from this finite sequence result.

Equality of the lower bound for finite sequences is achieved in a simple urn model: An urn contains 1 red marble and n − 1 green marbles, and these are sampled without replacement until the urn is empty. Let Xi = 1 if the red marble is drawn on the i-th trial and 0 otherwise. A finite sequence that achieves the lower covariance bound cannot be extended to a longer exchangeable sequence.[9]
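
The following Python sketch (illustrative, not from the source) verifies the finite-sequence bound exactly for this urn model: with 1 red and n − 1 green marbles sampled without replacement, the common covariance equals −σ²/(n − 1), so the lower bound is attained; the strictly negative value also shows, in line with the note above, that this finite exchangeable sequence cannot be a mixture of i.i.d. sequences.

```python
from fractions import Fraction

def urn_covariance(n):
    """Exact covariance of (X_i, X_j), i != j, where X_i indicates that the
    single red marble (among n - 1 green ones) appears on draw i when the
    urn is sampled without replacement until empty."""
    p = Fraction(1, n)        # P(X_i = 1): the red marble is equally likely
                              # to occupy any of the n positions
    e_product = Fraction(0)   # X_i and X_j cannot both equal 1
    variance = p * (1 - p)    # variance of the Bernoulli(1/n) indicator
    cov = e_product - p * p
    bound = -variance / (n - 1)
    return cov, bound

for n in (2, 5, 10):
    cov, bound = urn_covariance(n)
    print(n, cov, bound, cov == bound)  # the lower bound is attained exactly
```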

Examples

  • Any convex combination or mixture distribution of i.i.d. sequences of random variables is exchangeable. A converse proposition is de Finetti's theorem.[10]
  • Suppose an urn contains [math]\displaystyle{ n }[/math] red and [math]\displaystyle{ m }[/math] blue marbles. Suppose marbles are drawn without replacement until the urn is empty. Let [math]\displaystyle{ X_i }[/math] be the indicator random variable of the event that the [math]\displaystyle{ i }[/math]-th marble drawn is red. Then [math]\displaystyle{ \left\{ X_i \right\}_{i=1, \dots, n+m} }[/math] is an exchangeable sequence. This sequence cannot be extended to any longer exchangeable sequence.
  • Suppose an urn contains [math]\displaystyle{ n }[/math] red and [math]\displaystyle{ m }[/math] blue marbles. Further suppose a marble is drawn from the urn and then replaced, together with an extra marble of the same colour. Let [math]\displaystyle{ X_i }[/math] be the indicator random variable of the event that the [math]\displaystyle{ i }[/math]-th marble drawn is red. Then [math]\displaystyle{ \left\{ X_i \right\}_{i\in \N} }[/math] is an exchangeable sequence. This model is called Polya's urn (a simulation sketch appears after this list).
  • Let [math]\displaystyle{ (X, Y) }[/math] have a bivariate normal distribution with parameters [math]\displaystyle{ \mu = 0 }[/math], [math]\displaystyle{ \sigma_x = \sigma_y = 1 }[/math] and an arbitrary correlation coefficient [math]\displaystyle{ \rho\in (-1, 1) }[/math]. The random variables [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math] are then exchangeable, but independent only if [math]\displaystyle{ \rho=0 }[/math]. The density function is [math]\displaystyle{ p(x, y) = p(y, x) \propto \exp\left[-\frac{1}{2(1-\rho^2)}(x^2+y^2-2\rho xy)\right]. }[/math]
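
As a simulation sketch for the Polya urn example above (illustrative only, with assumed starting contents of 1 red and 1 blue marble), the following Python code estimates the joint distribution of the first three draws; exchangeability predicts that all patterns with the same number of red draws are equally likely (here each single-red pattern has probability 1/12), even though each draw strongly influences the later ones.

```python
import random
from collections import Counter

random.seed(1)

def polya_draws(n_draws, red=1, blue=1):
    """Simulate n_draws from a Polya urn: each drawn marble is returned
    together with one extra marble of the same colour (1 = red, 0 = blue)."""
    outcome = []
    for _ in range(n_draws):
        x = 1 if random.random() < red / (red + blue) else 0
        if x:
            red += 1
        else:
            blue += 1
        outcome.append(x)
    return tuple(outcome)

# Estimate the joint distribution of the first three draws.
counts = Counter(polya_draws(3) for _ in range(200_000))
total = sum(counts.values())

# Exchangeability: permuted patterns are equally likely (each about 1/12 here).
for pattern in [(1, 0, 0), (0, 1, 0), (0, 0, 1)]:
    print(pattern, counts[pattern] / total)
```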

Applications

The von Neumann extractor is a randomness extractor that depends on exchangeability: it gives a method to take an exchangeable sequence of 0s and 1s (Bernoulli trials), with some probability p of 0 and [math]\displaystyle{ q=1-p }[/math] of 1, and produce a (shorter) exchangeable sequence of 0s and 1s with probability 1/2.

Partition the sequence into non-overlapping pairs: if the two elements of the pair are equal (00 or 11), discard it; if the two elements of the pair are unequal (01 or 10), keep the first. This yields a sequence of Bernoulli trials with [math]\displaystyle{ p=1/2, }[/math] as, by exchangeability, the odds of a given pair being 01 or 10 are equal.
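
A minimal Python sketch of the pairing procedure just described (the function name and the biased test input are illustrative assumptions, not a reference implementation):

```python
import random

def von_neumann_extract(bits):
    """Von Neumann extractor: split the input into non-overlapping pairs,
    discard 00 and 11 pairs, and keep the first bit of each 01 or 10 pair.
    For an exchangeable 0/1 input, the output bits are unbiased."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

# Example: a biased but i.i.d. (hence exchangeable) input sequence.
random.seed(2)
biased = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]
unbiased = von_neumann_extract(biased)
print(sum(biased) / len(biased))       # about 0.8
print(sum(unbiased) / len(unbiased))   # about 0.5
```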

Exchangeable random variables arise in the study of U-statistics, particularly in the Hoeffding decomposition.[11]

Exchangeability is a key assumption of the distribution-free inference method of conformal prediction.

Notes

  1. In short, the order of the sequence of random variables does not affect its joint probability distribution.
    • Chow, Yuan Shih and Teicher, Henry, Probability theory. Independence, interchangeability, martingales, Springer Texts in Statistics, 3rd ed., Springer, New York, 1997. xxii+488 pp. ISBN:0-387-98228-0
  2. Aldous, David J., Exchangeability and related topics, in: École d'Été de Probabilités de Saint-Flour XIII — 1983, Lecture Notes in Math. 1117, pp. 1–198, Springer, Berlin, 1985. ISBN:978-3-540-15203-3 doi:10.1007/BFb0099421
  3. Diaconis, Persi (2009). "Book review: Probabilistic symmetries and invariance principles (Olav Kallenberg, Springer, New York, 2005)". Bulletin of the American Mathematical Society. New Series 46 (4): 691–696. doi:10.1090/S0273-0979-09-01262-2. 
  4. Kallenberg, O., Probabilistic symmetries and invariance principles. Springer-Verlag, New York (2005). 510 pp. ISBN:0-387-25115-4.
  5. Zabell (1992)
  6. Barlow & Irony (1992)
  7. Bergman (2009)
  8. O'Neill, B. (2009) Exchangeability, Correlation and Bayes' Effect. International Statistical Review 77(2), pp. 241–250.
  9. Taylor, Robert Lee; Daffer, Peter Z.; Patterson, Ronald F. (1985). Limit theorems for sums of exchangeable random variables. Rowman and Allanheld. pp. 1–152. ISBN 9780847674350. https://books.google.com/books?id=6RaoAAAAIAAJ. 
  10. Spizzichino, Fabio Subjective probability models for lifetimes. Monographs on Statistics and Applied Probability, 91. Chapman & Hall/CRC, Boca Raton, FL, 2001. xx+248 pp. ISBN:1-58488-060-0
  11. Borovskikh, Yu. V. (1996). "Chapter 10 Dependent variables". U-statistics in Banach spaces. Utrecht: VSP. pp. 365–376. ISBN 90-6764-200-2. 

Bibliography

  • Aldous, David J., Exchangeability and related topics, in: École d'Été de Probabilités de Saint-Flour XIII — 1983, Lecture Notes in Math. 1117, pp. 1–198, Springer, Berlin, 1985. ISBN:978-3-540-15203-3 doi:10.1007/BFb0099421
  • Barlow, R. E. & Irony, T. Z. (1992) "Foundations of statistical quality control" in Ghosh, M. & Pathak, P.K. (eds.) Current Issues in Statistical Inference: Essays in Honor of D. Basu, Hayward, CA: Institute of Mathematical Statistics, 99-112.
  • Bergman, B. (2009) "Conceptualistic Pragmatism: A framework for Bayesian analysis?", IIE Transactions, 41, 86–93
  • Borovskikh, Yu. V. (1996). U-statistics in Banach spaces. Utrecht: VSP. pp. xii+420. ISBN 90-6764-200-2. 
  • Chow, Yuan Shih and Teicher, Henry, Probability theory. Independence, interchangeability, martingales, Springer Texts in Statistics, 3rd ed., Springer, New York, 1997. xxii+488 pp. ISBN:0-387-98228-0
  • Diaconis, Persi (2009). "Book review: Probabilistic symmetries and invariance principles (Olav Kallenberg, Springer, New York, 2005)". Bulletin of the American Mathematical Society. New Series 46 (4): 691–696. doi:10.1090/S0273-0979-09-01262-2. 
  • Kallenberg, O., Probabilistic symmetries and invariance principles. Springer-Verlag, New York (2005). 510 pp. ISBN:0-387-25115-4.
  • Kingman, J. F. C., Uses of exchangeability, Ann. Probability 6 (1978) 183–197 MR494344 JSTOR 2243211
  • O'Neill, B. (2009) Exchangeability, Correlation and Bayes' Effect. International Statistical Review 77(2), pp. 241–250. doi:10.1111/j.1751-5823.2008.00059.x
  • Taylor, Robert Lee; Daffer, Peter Z.; Patterson, Ronald F. (1985). Limit theorems for sums of exchangeable random variables. Rowman and Allanheld. pp. 1–152. ISBN 9780847674350. https://books.google.com/books?id=6RaoAAAAIAAJ. 
  • Zabell, S. L. (1988) "Symmetry and its discontents", in Skyrms, B. & Harper, W. L. Causation, Chance and Credence, pp. 155–190, Kluwer
  • Zabell, S. L. (1992). "Predicting the unpredictable". Synthese 90 (2): 205. doi:10.1007/bf00485351.