# Inverted Dirichlet distribution

In statistics, the inverted Dirichlet distribution is a multivariate generalization of the beta prime distribution, and is related to the Dirichlet distribution. It was first described by Tiao and Guttman in 1965.[1] The distribution has a density function given by

$\displaystyle{ p\left(x_1,\ldots, x_k\right) = \frac{\Gamma\left(\nu_1+\cdots+\nu_{k+1}\right)}{\prod_{j=1}^{k+1}\Gamma\left(\nu_j\right)} x_1^{\nu_1-1}\cdots x_k^{\nu_k-1}\times\left(1+\sum_{i=1}^k x_i\right)^{-\sum_{j=1}^{k+1}\nu_j},\qquad x_i > 0. }$
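As a minimal sketch, the density above can be evaluated directly in log space for numerical stability. The function name below is illustrative, not from any standard library:

```python
import math

def inverted_dirichlet_pdf(x, nu):
    """Density of the inverted Dirichlet distribution with parameters
    nu = (nu_1, ..., nu_{k+1}), evaluated at x = (x_1, ..., x_k), x_i > 0."""
    k = len(x)
    assert len(nu) == k + 1
    # Normalizing constant: Gamma(sum nu_j) / prod Gamma(nu_j), in log space.
    log_norm = math.lgamma(sum(nu)) - sum(math.lgamma(v) for v in nu)
    # Kernel: prod x_i^{nu_i - 1} * (1 + sum x_i)^{-sum nu_j}.
    log_kernel = sum((nu[i] - 1.0) * math.log(x[i]) for i in range(k))
    log_kernel -= sum(nu) * math.log1p(sum(x))
    return math.exp(log_norm + log_kernel)
```

For k = 1 this reduces to the beta prime density, which gives a quick sanity check: with $\nu = (2, 3)$ and $x = 0.5$, the density is $12 \cdot 0.5 \cdot 1.5^{-5}$.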

The distribution has applications in statistical regression and arises naturally when considering the multivariate Student's t-distribution. It can be characterized[2] by its mixed moments:

$\displaystyle{ E\left[\prod_{i=1}^kx_i^{q_i}\right] = \frac{\Gamma\left(\nu_{k+1}-\sum_{j=1}^k q_j\right)}{\Gamma\left(\nu_{k+1}\right)}\prod_{j=1}^k\frac{\Gamma\left(\nu_j+q_j\right)}{\Gamma\left(\nu_j\right)} }$

provided that $\displaystyle{ q_j > -\nu_j, 1\leqslant j\leqslant k }$ and $\displaystyle{ \nu_{k+1} > q_1+\ldots+q_k }$.
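The moment formula can be checked by Monte Carlo simulation. The sketch below uses the standard gamma-ratio construction (not stated in this article, but well known): if $Y_1, \ldots, Y_{k+1}$ are independent with $Y_j \sim \mathrm{Gamma}(\nu_j, 1)$, then $X_i = Y_i / Y_{k+1}$ follows the inverted Dirichlet distribution. Function names here are illustrative:

```python
import math
import random

def mixed_moment(nu, q):
    """Closed-form mixed moment E[prod x_i^{q_i}] from the formula above,
    valid when q_j > -nu_j and nu_{k+1} > sum(q); computed in log space."""
    k = len(q)
    val = math.lgamma(nu[k] - sum(q)) - math.lgamma(nu[k])
    for j in range(k):
        val += math.lgamma(nu[j] + q[j]) - math.lgamma(nu[j])
    return math.exp(val)

def sample_inverted_dirichlet(nu, rng):
    """Draw one sample via independent gammas: X_i = Y_i / Y_{k+1}."""
    y = [rng.gammavariate(v, 1.0) for v in nu]
    return [yi / y[-1] for yi in y[:-1]]

rng = random.Random(0)
nu, q, n = [2.0, 3.0, 6.0], [1.0, 1.0], 100_000
mc = sum(x[0] ** q[0] * x[1] ** q[1]
         for x in (sample_inverted_dirichlet(nu, rng) for _ in range(n))) / n
exact = mixed_moment(nu, q)  # Gamma(4)/Gamma(6) * 2 * 3 = 0.3
```

With these parameters the Monte Carlo estimate should agree with the exact value 0.3 to within sampling error.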

The inverted Dirichlet distribution is conjugate to the negative multinomial distribution when the latter is parametrized by a generalized form of odds ratios rather than by the category probabilities: if the negative multinomial parameter vector is $\displaystyle{ p }$, the reparametrization is $\displaystyle{ x_i = \frac{p_i}{p_0}, i = 1\ldots k }$, where $\displaystyle{ p_0 = 1 - \sum_{i=1}^{k} p_i }$.
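The reparametrization and its inverse are straightforward to write down; the helper names below are illustrative:

```python
def probs_to_odds(p):
    """Map negative-multinomial probabilities p = (p_1, ..., p_k) to the
    generalized odds x_i = p_i / p_0, with p_0 = 1 - sum(p)."""
    p0 = 1.0 - sum(p)
    return [pi / p0 for pi in p]

def odds_to_probs(x):
    """Inverse map: p_i = x_i / (1 + sum(x))."""
    s = 1.0 + sum(x)
    return [xi / s for xi in x]
```

For example, $p = (0.2, 0.3)$ gives $p_0 = 0.5$ and odds $x = (0.4, 0.6)$; applying the inverse map recovers $p$.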

T. Bdiri et al. have developed several models that use the inverted Dirichlet distribution to represent and model non-Gaussian data. They introduced finite [3][4] and infinite [5] mixture models of inverted Dirichlet distributions, using the Newton–Raphson technique to estimate the parameters and the Dirichlet process to model infinite mixtures. They have also used the inverted Dirichlet distribution to propose an approach to generating Support Vector Machine kernels based on Bayesian inference [6] and another approach to establishing hierarchical clustering.[7][8]

## References

1. Tiao, George; Guttman, Irwin (1965). "The inverted Dirichlet distribution with applications". Journal of the American Statistical Association 60 (311): 793–805. doi:10.1080/01621459.1965.10480828.
2. Ghorbel, M. (2010). "On the inverted Dirichlet distribution". Communications in Statistics - Theory and Methods 39: 21–37. doi:10.1080/03610920802627062.
3. Bdiri, Taoufik; Bouguila, Nizar (2012). "Positive vectors clustering using inverted Dirichlet finite mixture models". Expert Systems with Applications 39 (2): 1869–1882. doi:10.1016/j.eswa.2011.08.063.
4. Bdiri, Taoufik; Bouguila, Nizar (2011). "Learning Inverted Dirichlet Mixtures for Positive Data Clustering". Rough Sets, Fuzzy Sets, Data Mining and Granular Computing. Lecture Notes in Computer Science. 6743. pp. 265–272. doi:10.1007/978-3-642-21881-1_42. ISBN 978-3-642-21880-4.
5. Bdiri, Taoufik; Bouguila, Nizar (2011). "An Infinite Mixture of Inverted Dirichlet Distributions". Neural Information Processing. Lecture Notes in Computer Science. 7063. pp. 71–78. doi:10.1007/978-3-642-24958-7_9. ISBN 978-3-642-24957-0.
6. Bdiri, Taoufik; Bouguila, Nizar (2013). "Bayesian learning of inverted Dirichlet mixtures for SVM kernels generation". Neural Computing and Applications 23 (5): 1443–1458. doi:10.1007/s00521-012-1094-z.
7. Bdiri, Taoufik; Bouguila, Nizar; Ziou, Djemel (2014). "Object clustering and recognition using multi-finite mixtures for semantic classes and hierarchy modeling". Expert Systems with Applications 41 (4): 1218–1235. doi:10.1016/j.eswa.2013.08.005.
8. Bdiri, Taoufik; Bouguila, Nizar; Ziou, Djemel (2013). "Visual Scenes Categorization Using a Flexible Hierarchical Mixture Model Supporting Users Ontology". 2013 IEEE 25th International Conference on Tools with Artificial Intelligence. pp. 262–267. doi:10.1109/ICTAI.2013.48. ISBN 978-1-4799-2972-6.