1. Agarap, A. F. M. (2018). Deep learning using Rectified Linear Units (ReLU). Accessed January 17, 2021, from https://arxiv.org/pdf/1803.08375.pdf
2. Alzubaidi, L., Zhang, J., Humaidi, A. J., Al-Dujaili, A., Duan, Y., Al-Shamma, O., et al. (2021). Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. Journal of Big Data, 8, 53. https://doi.org/10.1186/s40537-021-00444-8
3. Ananthanarayanan, R., Esser, S. K., Simon, H. D., & Modha, D. S. (2009). The cat is out of the bag: Cortical simulations with 10⁹ neurons, 10¹³ synapses. Proceedings of the Conference on High-Performance Computing Networking, Storage and Analysis. https://doi.org/10.1145/1654059.1654124
4. Brownlee, J. (2019, January 9; updated 2020, August 20). A gentle introduction to the Rectified Linear Unit (ReLU). Accessed January 17, 2021, from https://machinelearningmastery.com/rectified-linear-activation-function-for-deep-learning-neural-networks/
5. Deep learning. (2022, August 3). In Wikipedia. Accessed January 17, 2022, from https://en.wikipedia.org/wiki/Deep_learning