Darmois–Skitovich theorem

In mathematical statistics, the Darmois–Skitovich theorem characterizes the normal distribution (the Gaussian distribution) by the independence of two linear forms in independent random variables. The theorem was proved independently by G. Darmois and V. P. Skitovich in 1953.[1][2]

Formulation

Let [math]\displaystyle{ \xi_j, j = 1, 2, \ldots, n, n \ge 2, }[/math] be independent random variables, and let [math]\displaystyle{ \alpha_j, \beta_j }[/math] be nonzero constants. If the linear forms [math]\displaystyle{ L_1 = \alpha_1\xi_1 + \cdots + \alpha_n\xi_n }[/math] and [math]\displaystyle{ L_2 = \beta_1\xi_1 + \cdots + \beta_n\xi_n }[/math] are independent, then all the random variables [math]\displaystyle{ \xi_j }[/math] have normal distributions (Gaussian distributions).
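
In the converse direction, if the [math]\displaystyle{ \xi_j }[/math] are independent Gaussian variables with variances [math]\displaystyle{ \sigma_j^2 }[/math], then [math]\displaystyle{ (L_1, L_2) }[/math] is jointly Gaussian, so [math]\displaystyle{ L_1 }[/math] and [math]\displaystyle{ L_2 }[/math] are independent exactly when [math]\displaystyle{ \operatorname{cov}(L_1, L_2) = \alpha_1\beta_1\sigma_1^2 + \cdots + \alpha_n\beta_n\sigma_n^2 = 0 }[/math]. The sketch below illustrates this contrast numerically (a Python/NumPy illustration, not part of the theorem; the sample size, the seed, and the use of squared variables as a dependence probe are ad hoc choices): with [math]\displaystyle{ \alpha = (1, 1) }[/math], [math]\displaystyle{ \beta = (1, -1) }[/math], and equal-variance inputs the two forms are always uncorrelated, yet they are dependent for uniform inputs and independent for Gaussian ones.

  import numpy as np

  rng = np.random.default_rng(0)
  n = 1_000_000

  def probe(xi1, xi2):
      # L1 = xi1 + xi2 and L2 = xi1 - xi2, i.e. alpha = (1, 1), beta = (1, -1).
      L1, L2 = xi1 + xi2, xi1 - xi2
      # corr(L1, L2) is ~0 in both cases, since cov(L1, L2) = var(xi1) - var(xi2) = 0.
      # corr(L1^2, L2^2) is a crude probe that detects any remaining dependence.
      return np.corrcoef(L1, L2)[0, 1], np.corrcoef(L1**2, L2**2)[0, 1]

  # Uniform (non-Gaussian) inputs: uncorrelated forms, but the squares correlate.
  print(probe(rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)))
  # Gaussian inputs: the forms are genuinely independent, so both numbers are ~0.
  print(probe(rng.standard_normal(n), rng.standard_normal(n)))

The theorem asserts that the second behaviour is exclusive to the Gaussian case: independence of the two forms forces normality of every [math]\displaystyle{ \xi_j }[/math].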

History

The Darmois–Skitovich theorem is a generalization of the Kac–Bernstein theorem, in which the normal distribution (the Gaussian distribution) is characterized by the independence of the sum and the difference of two independent random variables. For the history of V. P. Skitovich's proof of the theorem, see [3].

References

  1. "Analyse générale des liaisons stochastiques: etude particulière de l'analyse factorielle linéaire". Review of the International Statistical Institute 21 (1/2): 2–8. 1953. doi:10.2307/1401511. 
  2. Skitovich, V. P. (1953). "On a property of the normal distribution". Doklady Akademii Nauk SSSR 89: 217–219. 
  3. "О теорем Дармуа-Скитовича" (in ru). http://www.apmath.spbu.ru/ru/misc/journal/p14-15.pdf. 
  • Kagan, A. M.; Linnik, Yu. V.; Rao, C. R. (1973). Characterization Problems in Mathematical Statistics. New York: Wiley.