A Lightweight Attentional Shift Graph Convolutional Network for Skeleton-Based Action Recognition
-
Published: 2023-05-09
Issue: 3
Volume: 18
Page:
-
ISSN: 1841-9844
-
Container-title: INTERNATIONAL JOURNAL OF COMPUTERS COMMUNICATIONS & CONTROL
-
Language:
-
Short-container-title: INT J COMPUT COMMUN; Int. J. Comput. Commun. Control
Author:
Li Xianshan, Kang Jingwen, Yang Yang, Zhao Fengda
Abstract
In the field of skeleton-based human action recognition, graph convolutional networks have achieved remarkable results. However, high-accuracy networks typically come with large parameter counts and high computational cost, which severely limits their deployment on mobile devices. To address the excessive spatiotemporal complexity of high-accuracy methods, this paper analyzes lightweight human action recognition models and proposes a lightweight attentional shift graph convolutional network. The model introduces three main improvements. First, shift convolution, a lightweight convolution method, is combined with graph convolution to effectively reduce its complexity. Second, a shallow architecture with multi-stream early fusion is designed to reduce the network scale by merging the multi-stream branches and reducing the number of network layers. Third, an efficient channel attention module is incorporated to capture latent feature information in the channel domain. Experiments are conducted on three benchmark skeleton datasets: NTU RGB+D, NTU-120 RGB+D, and Northwestern-UCLA. The results demonstrate that the proposed model is not only competitive in accuracy but also outperforms current mainstream methods in parameter count and computational cost, making it suitable for devices with limited computing and storage resources.
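To illustrate why shift convolution is lightweight, the following is a minimal sketch (not the paper's implementation) of a channel-wise temporal shift on a skeleton feature tensor of shape (channels, frames, joints): channel groups are displaced by one frame forward, one frame backward, or left in place, so temporal mixing costs no multiplications and the only learned parameters come from a subsequent pointwise (1x1) convolution. The function name, the three-way partition, and the shift offsets are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def temporal_shift(x, partitions=3):
    """Sketch of a channel-wise temporal shift (illustrative, not the paper's exact scheme).

    x: feature tensor of shape (C, T, V) -- channels, frames, skeleton joints.
    One channel group is shifted one frame forward in time, one group one
    frame backward, and the remaining channels are left untouched. Vacated
    frames are zero-filled. No multiplications are performed; a learned
    pointwise convolution applied afterwards would supply the parameters.
    """
    C, T, V = x.shape
    out = np.zeros_like(x)
    g = C // partitions
    out[:g, 1:, :] = x[:g, :-1, :]        # group 0: shifted forward in time
    out[g:2 * g, :-1, :] = x[g:2 * g, 1:, :]  # group 1: shifted backward in time
    out[2 * g:, :, :] = x[2 * g:, :, :]   # remaining channels: identity
    return out

# Tiny usage example: 3 channels, 4 frames, 2 joints.
x = np.arange(24, dtype=float).reshape(3, 4, 2)
y = temporal_shift(x, partitions=3)
```

Because the shift itself is parameter-free, replacing a temporal convolution of kernel size k with shift-plus-pointwise reduces that layer's parameters by roughly a factor of k, which is the kind of saving the abstract refers to.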
Publisher
Agora University of Oradea
Subject
Computational Theory and Mathematics,Computer Networks and Communications,Computer Science Applications
Cited by
1 article.