1. Gal, Y., and Ghahramani, Z. (2016, June 19–24). Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning. Proceedings of the 33rd International Conference on Machine Learning, New York, NY, USA. Available online: https://proceedings.mlr.press/v48/gal16.html.
2. Kendall, A., and Gal, Y. (2017, December 4–9). What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision? Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA. Available online: https://papers.nips.cc/paper_files/paper/2017/file/2650d6089a6d640c5e85b2b88265dc2b-Paper.pdf.
3. Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., and Salakhutdinov, R. (2014). Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res., 15, 1929–1958.
4. Sun, S., Zhang, C., and Yu, G. (2004, October 3–6). A Bayesian network approach to time series forecasting of short-term traffic flows. Proceedings of the 7th International IEEE Conference on Intelligent Transportation Systems, Washington, DC, USA.
5. Pearce, T., Leibfried, F., and Brintrup, A. (2020, August 26–28). Uncertainty in Neural Networks: Approximately Bayesian Ensembling. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, Online. Available online: http://proceedings.mlr.press/v108/pearce20a/pearce20a.pdf.