Affiliation:
1. Institute of Information Science and Technology, Donghua University, Shanghai 201620, China
Abstract
As human actions can be characterized by the trajectories of skeleton joints, skeleton-based action recognition techniques have gained increasing attention in the field of intelligent recognition and behavior analysis. With the emergence of large datasets, graph convolutional network (GCN) approaches have been widely applied to skeleton-based action recognition and have achieved remarkable performance. In this paper, a novel GCN-based approach is proposed that introduces a convolutional block attention module (CBAM)-based graph attention block to compute the semantic correlations between any two vertices. By considering semantic correlations, the model can effectively identify the most discriminative vertex connections associated with specific actions, even when the two vertices are physically unconnected. Experimental results demonstrate that the proposed model is effective and outperforms existing methods.
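The core idea described in the abstract, augmenting a fixed skeletal adjacency with learned pairwise correlations so that physically unconnected joints can still attend to each other, can be illustrated with a minimal sketch. This is not the paper's implementation: the layer sizes, the random projections `Wq`, `Wk`, `W`, and the scaled dot-product scoring are illustrative assumptions standing in for the CBAM-based attention block.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax along the given axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def semantic_attention_gcn(X, A, rng=None):
    """One GCN layer augmented with a semantic-correlation matrix
    between all vertex pairs (hypothetical sketch, not the paper's
    CBAM block).

    X : (V, C) joint features for V skeleton joints
    A : (V, V) physical skeletal adjacency
    """
    rng = np.random.default_rng(0) if rng is None else rng
    V, C = X.shape
    # Random stand-ins for learned projection weights.
    Wq = rng.standard_normal((C, C)) / np.sqrt(C)
    Wk = rng.standard_normal((C, C)) / np.sqrt(C)
    W = rng.standard_normal((C, C)) / np.sqrt(C)
    # Semantic correlation between any two vertices, connected or
    # not: row-wise softmax over scaled dot-product scores.
    S = softmax((X @ Wq) @ (X @ Wk).T / np.sqrt(C), axis=-1)
    # Aggregate over the fixed skeleton graph plus the learned
    # correlations, then apply a ReLU nonlinearity.
    H = np.maximum((A + S) @ X @ W, 0.0)
    return H, S
```

Because `S` is dense, every joint can contribute to every other joint's update, which is how action-specific but non-adjacent joint pairs (e.g., hand and head during eating) can be emphasized.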
Funder
National Natural Science Foundation of China
Natural Science Foundation of Shanghai
Subject
Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering