1. T. Chen and C. Guestrin, “XGBoost: A scalable tree boosting system,” in Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (2016).
2. X. Chen and H. Ishwaran, “Random forests for genomic data analysis,” Genomics 99 (6), 323–329 (2012).
3. S. E. Yuksel, J. N. Wilson, and P. D. Gader, “Twenty years of mixture of experts,” IEEE Trans. Neural Networks Learn. Syst. 23 (8), 1177–1193 (2012).
4. C. E. Rasmussen and Z. Ghahramani, “Infinite mixtures of Gaussian process experts,” in NIPS'01: Proceedings of the 14th International Conference on Neural Information Processing Systems: Natural and Synthetic (2002), pp. 881–888.
5. N. Shazeer, A. Mirhoseini, K. Maziarz, A. Davis, Q. Le, G. Hinton, and J. Dean, “Outrageously large neural networks: The sparsely-gated mixture-of-experts layer,” in International Conference on Learning Representations (2017).