Intelligent Gesture Recognition Based on Screen Reflectance Multi-Band Spectral Features
Authors:
Lin Peiying 1, Li Chenrui 2, Chen Sijie 2, Huangfu Jiangtao 2, Yuan Wei 1
Affiliation:
1. School of Electrical and Information Engineering, Jiangsu University of Science and Technology, Zhangjiagang 215600, China
2. Laboratory of Applied Research on Electromagnetics, Zhejiang University, Hangzhou 310027, China
Abstract
Gesture-based human–computer interaction (HCI) with screens is a pivotal interaction method amid the trend toward digitalization. In this work, a gesture recognition method is proposed that combines multi-band spectral features with the spatial characteristics of screen-reflected light. Based on this method, a red-green-blue (RGB) three-channel spectral gesture recognition system has been developed, whose hardware consists of a display screen integrated with narrowband spectral receivers. During operation, light emitted by the screen is reflected off the user's gestures and captured by the narrowband spectral receivers, which are placed at various locations and convert the multiple narrowband spectra into light-intensity series. The multi-narrowband spectral data integrate multidimensional features from the frequency and spatial domains, enhancing classification capability. Based on the RGB three-channel spectral features, this work formulates an RGB multi-channel convolutional neural network–long short-term memory (CNN-LSTM) gesture recognition model, which achieves accuracies of 99.93% in darkness and 99.89% under illumination. This indicates that the system operates stably across different lighting conditions and supports accurate interaction. The intelligent gesture recognition method can be widely applied for interaction with screens such as computers and mobile phones, enabling more convenient and precise HCI.
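The abstract describes narrowband receivers at different locations converting reflected R, G, and B spectra into light-intensity series, which are then stacked into multi-channel input for the CNN-LSTM classifier. A minimal sketch of that feature-assembly step is shown below; the receiver count, sample format, and function names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of assembling RGB multi-channel light-intensity series
# from narrowband receivers into a (channel, receiver, time) feature tensor,
# the kind of input a multi-channel CNN-LSTM classifier would consume.
# Receiver count, time steps, and synthetic readings are illustrative only.

def intensity_series(samples, channel):
    """Extract one narrowband channel (0=R, 1=G, 2=B) from raw (R, G, B) samples."""
    return [s[channel] for s in samples]

def build_feature_tensor(receiver_samples):
    """Stack per-receiver RGB intensity series into a 3 x n_receivers x n_time tensor.

    receiver_samples: list over receivers, each a list over time of (R, G, B) tuples.
    """
    return [
        [intensity_series(samples, ch) for samples in receiver_samples]
        for ch in range(3)  # R, G, B narrowband channels
    ]

# Example: 4 receivers, 5 time steps of synthetic (R, G, B) readings
raw = [[(r + t, r * 2, t) for t in range(5)] for r in range(4)]
feat = build_feature_tensor(raw)
print(len(feat), len(feat[0]), len(feat[0][0]))  # → 3 4 5
```

Each gesture then corresponds to one such tensor, combining spectral (channel), spatial (receiver location), and temporal (time-series) dimensions.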