Neutral vector

In statistics, and specifically in the study of the Dirichlet distribution, a neutral vector of random variables is one that exhibits a particular type of statistical independence amongst its elements.[1] In particular, when the elements of the random vector must add up to a certain sum, an element in the vector is neutral with respect to the others if the distribution of the vector created by expressing the remaining elements as proportions of their total is independent of the element that was omitted.

Definition

A single element [math]\displaystyle{ X_i }[/math] of a random vector [math]\displaystyle{ X_1,X_2,\ldots,X_k }[/math] is neutral if the relative proportions of all the other elements are independent of [math]\displaystyle{ X_i }[/math].

Formally, consider the vector of random variables

[math]\displaystyle{ X=(X_1,\ldots,X_k) }[/math]

where

[math]\displaystyle{ \sum_{i=1}^k X_i=1. }[/math]

The values [math]\displaystyle{ X_i }[/math] are interpreted as lengths whose sum is unity. In a variety of contexts, it is desirable to eliminate one proportion, say [math]\displaystyle{ X_1 }[/math], and consider the distribution of the remaining intervals within the remaining length. The first element of [math]\displaystyle{ X }[/math], viz. [math]\displaystyle{ X_1 }[/math], is defined as neutral if [math]\displaystyle{ X_1 }[/math] is statistically independent of the vector

[math]\displaystyle{ X^*_1 = \left( \frac{X_2}{1-X_1}, \frac{X_3}{1-X_1}, \ldots, \frac{X_k}{1-X_1} \right). }[/math]
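
The independence required here can be checked empirically. The sketch below is purely illustrative: it assumes [math]\displaystyle{ X }[/math] is drawn from a Dirichlet distribution with arbitrarily chosen parameters (Dirichlet vectors are completely neutral, as noted below), forms [math]\displaystyle{ X^*_1 }[/math] with NumPy, and prints the sample correlation between [math]\displaystyle{ X_1 }[/math] and each component of [math]\displaystyle{ X^*_1 }[/math]. Near-zero correlations are consistent with, though of course not a proof of, independence.

  import numpy as np
  # Illustrative check of the neutrality of X_1 for a Dirichlet vector.
  # The parameter vector alpha below is an arbitrary choice for the example.
  rng = np.random.default_rng(0)
  alpha = np.array([2.0, 3.0, 1.5, 4.0])
  X = rng.dirichlet(alpha, size=100_000)      # each row sums to 1
  X1 = X[:, 0]
  X_star_1 = X[:, 1:] / (1.0 - X1)[:, None]   # (X_2, ..., X_k) / (1 - X_1)
  # Sample correlation of X_1 with each renormalised component:
  # values near zero are consistent with independence.
  for j in range(X_star_1.shape[1]):
      r = np.corrcoef(X1, X_star_1[:, j])[0, 1]
      print(f"corr(X_1, component {j + 1} of X*_1) = {r:+.4f}")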

Variable [math]\displaystyle{ X_2 }[/math] is neutral if [math]\displaystyle{ X_2/(1-X_1) }[/math] is independent of the remaining interval: that is, if [math]\displaystyle{ X_2/(1-X_1) }[/math] is independent of

[math]\displaystyle{ X^*_{1,2} = \left( \frac{X_3}{1-X_1-X_2}, \frac{X_4}{1-X_1-X_2}, \ldots, \frac{X_k}{1-X_1-X_2} \right). }[/math]

Thus [math]\displaystyle{ X_2/(1-X_1) }[/math], viewed as the first element of the renormalised vector [math]\displaystyle{ Y = \left( \frac{X_2}{1-X_1}, \frac{X_3}{1-X_1}, \ldots, \frac{X_k}{1-X_1} \right) }[/math], is neutral within [math]\displaystyle{ Y }[/math]; the short computation below makes the equivalence explicit.
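
Writing [math]\displaystyle{ Y_1, Y_2, \ldots }[/math] for the components of [math]\displaystyle{ Y }[/math], excluding its first element and renormalising recovers exactly [math]\displaystyle{ X^*_{1,2} }[/math]. For the first remaining component, for instance,

[math]\displaystyle{ \frac{Y_2}{1-Y_1} = \frac{X_3/(1-X_1)}{1 - X_2/(1-X_1)} = \frac{X_3/(1-X_1)}{(1-X_1-X_2)/(1-X_1)} = \frac{X_3}{1-X_1-X_2}, }[/math]

and the other components follow in the same way.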

In general, variable [math]\displaystyle{ X_j }[/math] is neutral if the proportion [math]\displaystyle{ X_j/(1-X_1-\cdots-X_{j-1}) }[/math] is independent of

[math]\displaystyle{ X^*_{1,\ldots,j} = \left( \frac{X_{j+1}}{1-X_1-\cdots -X_j}, \ldots, \frac{X_k}{1-X_1-\cdots - X_j} \right). }[/math]

Complete neutrality

A vector for which each element is neutral is completely neutral.

If [math]\displaystyle{ X = (X_1, \ldots, X_k)\sim\operatorname{Dir}(\alpha) }[/math] is drawn from a Dirichlet distribution, then [math]\displaystyle{ X }[/math] is completely neutral. In 1980, James and Mosimann[2] showed that the Dirichlet distribution is characterised by neutrality.
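
Complete neutrality of the Dirichlet distribution can be illustrated numerically; the sketch below uses arbitrarily chosen parameters. For a Dirichlet vector the successive proportions [math]\displaystyle{ X_j/(1-X_1-\cdots-X_{j-1}) }[/math], [math]\displaystyle{ j=1,\ldots,k-1 }[/math], are mutually independent, which is the independence that complete neutrality requires, so their pairwise sample correlations should all be close to zero.

  import numpy as np
  # Illustrative check of complete neutrality for a Dirichlet vector:
  # the successive proportions Z_j = X_j / (1 - X_1 - ... - X_{j-1})
  # should be mutually independent, hence (in particular) uncorrelated.
  rng = np.random.default_rng(1)
  alpha = np.array([1.0, 2.0, 3.0, 4.0])      # arbitrary example parameters
  X = rng.dirichlet(alpha, size=100_000)
  # Column j (1-indexed) of `remaining` holds 1 - X_1 - ... - X_{j-1},
  # so its first column is identically 1.
  remaining = 1.0 - np.cumsum(X, axis=1) + X
  Z = X[:, :-1] / remaining[:, :-1]           # drop the last ratio, which is identically 1
  corr = np.corrcoef(Z, rowvar=False)
  print(np.round(corr, 3))                    # off-diagonal entries should be near 0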

References

  1. Connor, R. J.; Mosimann, J. E. (1969). "Concepts of Independence for Proportions with a Generalization of the Dirichlet Distribution". Journal of the American Statistical Association 64 (325): 194–206. doi:10.2307/2283728.
  2. James, Ian R.; Mosimann, James E. (1980). "A new characterization of the Dirichlet distribution through neutrality". The Annals of Statistics 8 (1): 183–189. doi:10.1214/aos/1176344900.