Publisher: Springer Nature Switzerland

References (12 articles)
1. Snoek, J., Larochelle, H., Adams, R.P.: Practical Bayesian optimization of machine learning algorithms. In: Proceedings of the 25th International Conference on Neural Information Processing Systems, vol. 2, pp. 2951–2959. Curran Associates Inc., Red Hook, New York (2012)
2. Erten, G.E., Keser, S.B., Yavuz, M.: Grid search optimised artificial neural network for open stope stability prediction. Int. J. Min. Reclam. Environ. 35(8), 600–617 (2021). https://doi.org/10.1080/17480930.2021.1899404
3. Bergstra, J., Bengio, Y.: Random search for hyper-parameter optimization. J. Mach. Learn. Res. 13(10), 281–305 (2012)
4. Young, S.R., Rose, D.C., Karnowski, T.P., Lim, S.-H., Patton, R.M.: Optimizing deep learning hyper-parameters through an evolutionary algorithm. In: Proceedings of the Workshop on Machine Learning in High-Performance Computing Environments, p. 5. Association for Computing Machinery, New York (2015). https://doi.org/10.1145/2834892.2834896
5. Zhang, R., Qiu, Z.: Optimizing hyper-parameters of neural networks with swarm intelligence: a novel framework for credit scoring. PLoS ONE 15(6), e0234254 (2020). https://doi.org/10.1371/journal.pone.0234254