1. Fogg, A.: Anthony Goldbloom gives you the secret to winning Kaggle competitions. Blog post (2016). https://www.import.io/post/how-to-win-a-kaggle-competition/
2. Freund, Y.: An adaptive version of the boost by majority algorithm. Mach. Learn. 43(3), 293–318 (2001)
3. Schapire, R.E.: The boosting approach to machine learning: an overview. In: Nonlinear Estimation and Classification, pp. 149–171. Springer, New York, NY (2003)
4. Brophy, J., Lowd, D.: Instance-based uncertainty estimation for gradient-boosted regression trees. In: Proceedings of the 36th Conference on Neural Information Processing Systems (NeurIPS 2022) (2022)
5. Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., Liu, T.-Y.: LightGBM: a highly efficient gradient boosting decision tree. In: Proceedings of the 31st International Conference on Neural Information Processing Systems, NIPS'17, pp. 3149–3157. Curran Associates Inc., Red Hook, NY, USA (2017)