LogitBoost
In machine learning and computational learning theory, LogitBoost is a boosting algorithm formulated by Jerome Friedman, Trevor Hastie, and Robert Tibshirani.
The original paper casts the AdaBoost algorithm into a statistical framework.[1] Specifically, if one considers AdaBoost as a generalized additive model and then applies the cost function of logistic regression, one can derive the LogitBoost algorithm.[2]
Minimizing the LogitBoost cost function
LogitBoost can be seen as a convex optimization problem. Specifically, given that we seek an additive model of the form
- [math]\displaystyle{ f = \sum_t \alpha_t h_t }[/math]
the LogitBoost algorithm minimizes the logistic loss:
- [math]\displaystyle{ \sum_i \log\left( 1 + e^{-y_i f(x_i)}\right) }[/math]
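This sum is the negative binomial log-likelihood, and the cited Friedman–Hastie–Tibshirani paper minimizes it stagewise with approximate Newton steps: each round computes per-example weights and working responses from the current probability estimates, fits a weak learner to the working responses by weighted least squares, and adds it to the model. Below is a minimal sketch of that procedure, assuming binary labels in {0, 1} and regression stumps as weak learners; the round count, stump depth, and clipping thresholds are illustrative choices, not fixed by the algorithm.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def logitboost_fit(X, y, n_rounds=50, max_depth=1):
    """Sketch of LogitBoost for binary labels y in {0, 1}.

    Each round takes an approximate Newton step on the logistic
    loss by fitting a regression stump to the working response
    with weighted least squares.
    """
    F = np.zeros(len(y))  # additive model, f = sum_t alpha_t h_t
    learners = []
    for _ in range(n_rounds):
        p = 1.0 / (1.0 + np.exp(-2.0 * F))       # current estimate of P(y = 1 | x)
        w = np.clip(p * (1.0 - p), 1e-8, None)   # Newton weights
        z = np.clip((y - p) / w, -4.0, 4.0)      # working response, clipped for stability
        h = DecisionTreeRegressor(max_depth=max_depth)
        h.fit(X, z, sample_weight=w)             # weighted least-squares fit
        F += 0.5 * h.predict(X)                  # half Newton step
        learners.append(h)
    return learners

def logitboost_predict(learners, X):
    """Classify by the sign of the additive model."""
    F = 0.5 * sum(h.predict(X) for h in learners)
    return (F > 0).astype(int)

# Toy usage:
# X = np.random.randn(200, 2)
# y = (X[:, 0] + X[:, 1] > 0).astype(int)
# model = logitboost_fit(X, y)
# accuracy = (logitboost_predict(model, X) == y).mean()
```

Fitting each weak learner by weighted least squares on the working response is what distinguishes this Newton-step view from AdaBoost's exponential-loss reweighting.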
References
- ↑ Friedman, Jerome; Hastie, Trevor; Tibshirani, Robert (2000). "Additive logistic regression: a statistical view of boosting". Annals of Statistics 28 (2): 337–407. doi:10.1214/aos/1016218223.
- ↑ "Machine Learning Algorithms for Beginners" (in en-US). https://www.prodigitalweb.com/machine-learning-algorithms-for-beginners/.
Original source: https://en.wikipedia.org/wiki/LogitBoost.