Author:
He Zhiquan, Zhang Lujun, Wang Hengyou
Abstract
Human motion prediction is a fundamental problem in computer vision, and much deep-learning-based work has shown impressive performance on it in recent years. However, long-term prediction and human skeletal deformation remain challenging. For accurate prediction, this paper proposes a GCN-based two-stage prediction method. In the first stage, we train a prediction model that extracts features with multiple cascaded spatial attention graph convolution layers (SAGCL) and generates an initial motion sequence of future actions from the observed poses. Since the initial poses generated in the first stage often deviate from natural human motion (for example, sequences in which bone lengths change), the task of the second stage is to fine-tune the predicted poses and bring them closer to natural motion. We present a fine-tuning model consisting of multiple cascaded causally temporal graph convolution layers (CT-GCL), trained with the spatial coordinate error of joints and the bone length error as loss functions. We validate our models on the Human3.6M and CMU-MoCap datasets. Extensive experiments show that the two-stage prediction method outperforms state-of-the-art methods. The limitations of the proposed methods are discussed as well, in the hope of enabling breakthroughs in future exploration.
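The abstract names two losses for the fine-tuning stage, the spatial coordinate error of joints and the bone length error, but does not give their exact formulations. A minimal sketch, assuming a mean per-joint position error and an L1 penalty on bone-length deviation (both common choices in this literature, not confirmed by the paper), might look like:

```python
import numpy as np

def joint_coordinate_loss(pred, target):
    """Mean per-joint position error between predicted and ground-truth
    3D joint coordinates; both arrays are shaped (frames, joints, 3)."""
    return np.mean(np.linalg.norm(pred - target, axis=-1))

def bone_length_loss(pred, target, bones):
    """Penalize deviation of predicted bone lengths from ground-truth
    lengths; `bones` is a list of (parent, child) joint-index pairs."""
    parents, children = map(list, zip(*bones))
    pred_len = np.linalg.norm(pred[:, children] - pred[:, parents], axis=-1)
    true_len = np.linalg.norm(target[:, children] - target[:, parents], axis=-1)
    return np.mean(np.abs(pred_len - true_len))
```

The bone term directly targets the skeletal-deformation artifact the paper describes: it is zero whenever every predicted bone keeps its ground-truth length, regardless of where the joints are placed.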
Subject
Cellular and Molecular Neuroscience, Neuroscience (miscellaneous)
Cited by 4 articles.