Authors:
Niu Kai, Zhang Fusang, Fu Xiaolai, Jin Beihong
Abstract
This paper presents AcousticPAD, a contactless and robust handwriting recognition system that uses acoustic signals to extend input and interaction beyond the touchscreen, which is particularly useful under the impact of the COVID-19 epidemic. To achieve this, we carefully exploit acoustic pulse signals to obtain high-accuracy time-of-flight (ToF) measurements. We then employ a trilateration localization method to capture the trajectory of in-air handwriting. After that, we incorporate a data augmentation module to enhance handwriting recognition performance. Finally, we customize a back-propagation neural network that is trained on the augmented image dataset to recognize the handwritten characters generated by the acoustic system. We implement an AcousticPAD prototype using cheap commodity acoustic sensors and conduct extensive real-environment experiments to evaluate its performance. The results validate the robustness of AcousticPAD and show that it recognizes the 10 digits and 26 English letters with high accuracy.
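The trilateration step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sensor layout, the speed of sound, and the least-squares linearization are all assumptions; the paper itself does not specify these details.

```python
# Hedged sketch: 2-D trilateration from acoustic ToF measurements.
# Each ToF is converted to a distance via the speed of sound, and the
# resulting circle equations are linearized and solved by least squares.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, approximate value at room temperature (assumption)

def trilaterate(anchors, tofs):
    """Estimate a 2-D position from ToF measurements to >= 3 known sensors.

    anchors: (n, 2) array of sensor coordinates in metres (assumed known).
    tofs:    (n,) array of time-of-flight measurements in seconds.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = SPEED_OF_SOUND * np.asarray(tofs, dtype=float)  # ToF -> range
    # Subtract the first circle equation |x - a_0|^2 = d_0^2 from the rest:
    #   2 (a_i - a_0)^T x = |a_i|^2 - |a_0|^2 - d_i^2 + d_0^2
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2)
         - d[1:] ** 2 + d[0] ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

Repeating this estimate for each acoustic pulse yields a sequence of positions, i.e. the in-air handwriting trajectory that the later recognition stages consume.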
Publisher
Springer International Publishing
References (21 articles)
1. Polancos, R.V., Ruiz, J.M.B., Subang, E.A.I.: User experience study on touchscreen technology: a case study on automated payment machines. In: 2020 IEEE 7th International Conference on Industrial Engineering and Applications (ICIEA), pp. 710–714 (2020)
2. Yi, C., Yang, Q., Scoglio, C.: Understanding the effects of the direct contacts and the indirect contacts on the epidemic spreading among beef cattle farms in southwest Kansas. bioRxiv (2020)
3. Wang, Y., Ren, A., Zhou, M., Wang, W., Yang, X.: A novel detection and recognition method for continuous hand gesture using FMCW radar. IEEE Access 8, 167264–167275 (2020)
4. Zhang, Z., Tian, Z., Zhou, M.: Latern: Dynamic continuous hand gesture recognition using FMCW radar sensor. IEEE Sens. J. 18(8), 3278–3289 (2018)
5. Jiang, F., Zhang, S., Wu, S., Gao, Y., Zhao, D.: Multi-layered gesture recognition with Kinect. J. Mach. Learn. Res. 16(1), 227–254 (2015)