Affiliation:
1. Institute of Physical Culture, Harbin University, Harbin, China
2. Department of Physical Education, Harbin Finance University, Harbin, China; The Graduate School of Saint Paul University Philippines, Ottawa, Philippines
Abstract
Recognizing human actions from depth map sequences is an important research area in computer vision, but traditional depth-map-based methods carry a large amount of redundant information. This paper therefore proposes a new feature representation for depth map sequences based on a discriminative collaborative representation classifier, which highlights the temporal ordering of human action features. An energy field is established according to the shape and motion characteristics of the human body to obtain its energy information, and this energy information is then projected onto three orthogonal axes to produce depth spatial-temporal energy maps. Meanwhile, to address the high probability of misclassifying similar samples with the collaborative representation classifier (CRC), a discriminative CRC (DCRC) is proposed. The classifier accounts for the influence of both the full training set and each class of samples on the collaborative representation coefficients, yielding highly discriminative coefficients and improving the separability of similar samples. Experimental results on the MSR Action3D dataset show that the redundancy of the key-frame algorithm is reduced and the running efficiency of each algorithm improves by 20%-30%. The proposed algorithm reduces the redundant information in depth map sequences and improves the feature-map extraction rate. It not only preserves the spatial information of human actions through the energy field, but also completely records their temporal information. Moreover, it maintains high recognition accuracy on action data with temporal information.
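For readers unfamiliar with collaborative representation classification, the sketch below illustrates the CRC baseline that the abstract's DCRC builds on: ridge-regularized coding of a query over the training dictionary, followed by class-wise residual scoring. Because the abstract does not give the exact DCRC objective, the per-class discriminative term here (weighted by a hypothetical parameter gamma) is only an assumption used for illustration; the function name, parameters, and data layout are likewise illustrative, not the authors' implementation.

```python
import numpy as np

def dcrc_classify(y, X, labels, lam=1e-2, gamma=1e-2):
    """Sketch of CRC with an illustrative per-class (discriminative) term.

    y      : (d,)  query feature, e.g. a flattened depth spatial-temporal energy map
    X      : (d,n) columns are training features (assumed l2-normalized)
    labels : (n,)  class label of each training column
    lam    : ridge (collaborative) regularization weight
    gamma  : weight of the illustrative per-class term (assumption; the paper's
             exact DCRC formulation is not stated in the abstract)
    """
    n = X.shape[1]
    classes = np.unique(labels)

    # Assumed objective:
    #   J(a) = ||y - X a||^2 + lam ||a||^2 + gamma * sum_c ||y - X_c a_c||^2
    # The per-class part contributes a block-diagonal Gram matrix D and a
    # stacked right-hand side b.
    D = np.zeros((n, n))
    b = np.zeros(n)
    for c in classes:
        idx = np.where(labels == c)[0]
        Xc = X[:, idx]
        D[np.ix_(idx, idx)] = Xc.T @ Xc
        b[idx] = Xc.T @ y

    # Closed-form coding vector: (X^T X + lam I + gamma D) a = X^T y + gamma b
    a = np.linalg.solve(X.T @ X + lam * np.eye(n) + gamma * D,
                        X.T @ y + gamma * b)

    # Standard CRC decision rule: pick the class whose sub-dictionary
    # reconstructs y with the smallest coefficient-normalized residual.
    scores = {}
    for c in classes:
        mask = labels == c
        residual = np.linalg.norm(y - X[:, mask] @ a[mask])
        scores[c] = residual / (np.linalg.norm(a[mask]) + 1e-12)
    return min(scores, key=scores.get)
```

With gamma set to 0 this reduces to the plain CRC decision rule; the extra term only illustrates how class-specific reconstruction can be folded into the coding step to make the coefficients more discriminative.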
Publisher
National Library of Serbia
Cited by
1 article.