LightGBM
| Developer(s) | Microsoft and LightGBM contributors[1] |
| --- | --- |
| Initial release | 2016 |
| Stable release | v3.3.4[2] / December 29, 2022 |
| Repository | github.com/microsoft/LightGBM |
| Written in | C++, Python, R, C |
| Operating system | Windows, macOS, Linux |
| Type | Machine learning, gradient-boosting framework |
| License | MIT License |
| Website | lightgbm.readthedocs.io |
LightGBM, short for light gradient-boosting machine, is a free and open-source distributed gradient-boosting framework for machine learning, originally developed by Microsoft.[3][4] It is based on decision tree algorithms and used for ranking, classification and other machine learning tasks. The development focus is on performance and scalability.
Overview
The LightGBM framework supports different algorithms, including GBT, GBDT, GBRT, GBM, MART[5][6] and RF.[7] LightGBM has many of XGBoost's advantages, including sparse optimization, parallel training, multiple loss functions, regularization, bagging, and early stopping. A major difference between the two lies in how trees are constructed. LightGBM does not grow a tree level-wise (row by row) as most other implementations do.[8] Instead it grows trees leaf-wise, repeatedly splitting the leaf it expects to yield the largest decrease in loss.[9]

In addition, LightGBM does not use the widely used pre-sort-based decision tree learning algorithm, which searches for the best split point on sorted feature values,[10] as XGBoost and other implementations do. Instead, LightGBM implements a highly optimized histogram-based decision tree learning algorithm, which buckets continuous feature values into discrete bins and yields large advantages in both efficiency and memory consumption.[11] The LightGBM algorithm also utilizes two novel techniques, Gradient-Based One-Side Sampling (GOSS) and Exclusive Feature Bundling (EFB), which allow the algorithm to run faster while maintaining a high level of accuracy.[12]
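These design choices surface directly as parameters of the Python package. The following is a minimal, illustrative sketch using lightgbm's scikit-learn interface (a 3.x release is assumed); the synthetic dataset and parameter values are arbitrary demonstration choices, not tuned recommendations.

```python
# Minimal LightGBM training sketch (assumes lightgbm 3.x and scikit-learn).
import lightgbm as lgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = lgb.LGBMClassifier(
    boosting_type="gbdt",  # gradient-boosted decision trees (the default)
    num_leaves=31,         # bounds leaf-wise growth; the key complexity control
    max_bin=255,           # resolution of the feature histograms
    learning_rate=0.1,
    n_estimators=100,
)
model.fit(
    X_train,
    y_train,
    eval_set=[(X_test, y_test)],
    callbacks=[lgb.early_stopping(stopping_rounds=10)],  # early stopping, as noted above
)
print("held-out accuracy:", model.score(X_test, y_test))
```

Because trees grow leaf-wise, `num_leaves` rather than tree depth is the primary model-complexity parameter, and `max_bin` trades split-finding precision against the memory savings of the histogram algorithm.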
LightGBM works on Linux, Windows, and macOS and supports C++, Python,[13] R, and C#.[14] The source code is licensed under the MIT License and is available on GitHub.[15]
Gradient-based one-side sampling
Gradient-based one-side sampling (GOSS) is a sampling method that exploits the fact that data instances carry no native weights in GBDT. Since data instances with different gradients play different roles in the computation of information gain, instances with larger gradients contribute more to the information gain. To retain accuracy, GOSS therefore keeps all instances with large gradients and randomly samples only a fraction of the instances with small gradients; to compensate for the resulting shift in the data distribution, the sampled small-gradient instances are amplified by a constant factor when the information gain is computed.[12]
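A sketch of this sampling step, following the description in Ke et al. (2017),[12] is given below; `top_rate` and `other_rate` correspond to the paper's sampling ratios a and b, and the function is an illustration rather than LightGBM's internal implementation.

```python
# Illustrative GOSS subsampling (not LightGBM's internal code).
import numpy as np

def goss_sample(gradients, top_rate=0.2, other_rate=0.1, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    n = len(gradients)
    n_top = int(top_rate * n)
    n_other = int(other_rate * n)
    order = np.argsort(np.abs(gradients))[::-1]  # sort by |gradient|, descending
    top_idx = order[:n_top]                      # keep every large-gradient instance
    sampled_idx = rng.choice(order[n_top:], size=n_other, replace=False)
    idx = np.concatenate([top_idx, sampled_idx])
    weights = np.ones(len(idx))
    # Amplify the sampled small-gradient instances to keep the
    # estimated information gain approximately unbiased.
    weights[n_top:] = (1.0 - top_rate) / other_rate
    return idx, weights

gradients = np.random.default_rng(0).normal(size=1000)
idx, weights = goss_sample(gradients)
print(len(idx), weights.min(), weights.max())  # 300 instances, weights 1.0 and 8.0
```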
Exclusive feature bundling
Exclusive feature bundling (EFB) is a near-lossless method of reducing the number of effective features. In a sparse feature space many features are nearly exclusive, meaning they rarely take nonzero values simultaneously; one-hot encoded features are a perfect example. EFB merges such features, reducing dimensionality and thereby improving efficiency while maintaining a high level of accuracy. A set of exclusive features merged into a single feature in this way is called an exclusive feature bundle.[12]
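The merging step can be sketched as follows. Assuming a set of integer-valued features already known to be mutually exclusive, each feature is shifted into its own value range so that one column can encode them all; the greedy, conflict-tolerant bundling that LightGBM uses to find such sets is omitted for brevity.

```python
# Illustrative merge of mutually exclusive features into one bundle
# by offsetting value ranges (after Ke et al., 2017).
import numpy as np

def bundle_exclusive(features):
    """features: list of 1-D integer arrays that never overlap in nonzeros."""
    bundled = np.zeros(len(features[0]), dtype=np.int64)
    offset = 0
    for f in features:
        nonzero = f != 0
        bundled[nonzero] = f[nonzero] + offset  # shift into a disjoint range
        offset += int(f.max())                  # reserve that range
    return bundled

# Two one-hot columns of the same categorical variable never fire together:
a = np.array([1, 0, 0, 1, 0])
b = np.array([0, 1, 0, 0, 1])
print(bundle_exclusive([a, b]))  # [1 2 0 1 2] -- a single feature encodes both
```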
References
- ↑ "microsoft/LightGBM". 7 July 2022. https://github.com/microsoft/LightGBM.
- ↑ "Releases · microsoft/LightGBM". https://github.com/microsoft/LightGBM/releases.
- ↑ Brownlee, Jason (March 31, 2020). "Gradient Boosting with Scikit-Learn, XGBoost, LightGBM, and CatBoost". https://machinelearningmastery.com/gradient-boosting-with-scikit-learn-xgboost-lightgbm-and-catboost/.
- ↑ Kopitar, Leon; Kocbek, Primoz; Cilar, Leona; Sheikh, Aziz; Stiglic, Gregor (July 20, 2020). "Early detection of type 2 diabetes mellitus using machine learning-based prediction models". Scientific Reports 10 (1): 11981. doi:10.1038/s41598-020-68771-z. PMID 32686721. Bibcode: 2020NatSR..1011981K.
- ↑ "Understanding LightGBM Parameters (and How to Tune Them)". May 6, 2020. https://neptune.ai/blog/lightgbm-parameters-guide.
- ↑ "An Overview of LightGBM". May 16, 2018. https://www.avanwyk.com/an-overview-of-lightgbm/.
- ↑ "Parameters — LightGBM 3.0.0.99 documentation". https://lightgbm.readthedocs.io/en/latest/Parameters.html#boosting.
- ↑ "The Gradient Boosters IV: LightGBM". Deep & Shallow.
- ↑ Ye, Andre (September 2020). "XGBoost, LightGBM, and Other Kaggle Competition Favorites". Towards Data Science.
- ↑ Mehta, Manish; Agrawal, Rakesh; Rissanen, Jorma (1996). "SLIQ: A fast scalable classifier for data mining". International Conference on Extending Database Technology: 18–32.
- ↑ "Features — LightGBM 3.1.0.99 documentation". https://lightgbm.readthedocs.io/en/latest/Features.html#optimization-in-speed-and-memory-usage.
- ↑ Ke, Guolin; Meng, Qi; Finley, Thomas; Wang, Taifeng; Chen, Wei; Ma, Weidong; Ye, Qiwei; Liu, Tie-Yan (2017). "LightGBM: A Highly Efficient Gradient Boosting Decision Tree". Advances in Neural Information Processing Systems 30. https://papers.nips.cc/paper/2017/hash/6449f44a102fde848669bdd9eb6b76fa-Abstract.html.
- ↑ "lightgbm: LightGBM Python Package". 7 July 2022. https://github.com/microsoft/LightGBM.
- ↑ "Microsoft.ML.Trainers.LightGbm Namespace". https://docs.microsoft.com/en-us/dotnet/api/microsoft.ml.trainers.lightgbm.
- ↑ "microsoft/LightGBM". October 6, 2020. https://github.com/microsoft/LightGBM.
Further reading
- Guolin Ke; Qi Meng; Thomas Finley; Taifeng Wang; Wei Chen; Weidong Ma; Qiwei Ye; Tie-Yan Liu (2017). "LightGBM: A Highly Efficient Gradient Boosting Decision Tree". Advances in Neural Information Processing Systems. https://papers.nips.cc/paper/6907-lightgbm-a-highly-efficient-gradient-boosting-decision-tree.pdf.
- Next-Generation Machine Learning with Spark – Covers XGBoost, LightGBM, Spark NLP, Distributed Deep Learning with Keras, and More. Apress. 2020. ISBN 978-1-4842-5668-8. https://www.apress.com/gp/book/9781484256688.
- Machine Learning with LightGBM and Python. Packt Publishing. 2023. ISBN 978-1800564749.