Authors:
Ghada Alsuhli, Vasilis Sakellariou, Hani Saleh, Mahmoud Al-Qutayri, Baker Mohammad, Thanos Stouraitis
Publisher:
Springer Nature Switzerland
References (22 articles):
1. Lu, J., Fang, C., Xu, M., Lin, J., Wang, Z.: Evaluations on deep neural networks training using posit number system. IEEE Trans. Comput. 70(2), 174–187 (2020)
2. Carmichael, Z., Langroudi, H.F., Khazanov, C., Lillie, J., Gustafson, J.L., Kudithipudi, D.: Deep positron: a deep neural network using the posit number system. In: 2019 Design, Automation & Test in Europe Conference & Exhibition (DATE), pp. 1421–1426. IEEE (2019)
3. Gustafson, J.L., Yonemoto, I.T.: Beating floating point at its own game: posit arithmetic. Supercomput. Front. Innov. 4(2), 71–86 (2017)
4. Cococcioni, M., Rossi, F., Ruffaldi, E., Saponara, S.: A fast approximation of the hyperbolic tangent when using posit numbers and its application to deep neural networks. In: International Conference on Applications in Electronics Pervading Industry, Environment and Society, pp. 213–221. Springer, Berlin (2019)
5. Romanov, A.Y., Stempkovsky, A.L., Lariushkin, I.V., Novoselov, G.E., Solovyev, R.A., Starykh, V.A., Romanova, I.I., Telpukhov, D.V., Mkrtchan, I.A.: Analysis of posit and Bfloat arithmetic of real numbers for machine learning. IEEE Access 9, 82318–82324 (2021)