Abstract
In recent years, deep convolutional neural networks (CNNs) have achieved great success in visual tracking. To learn discriminative representations, most existing methods exploit the category of an image region, namely target or background, and/or the target's motion across consecutive frames. Although these methods have proven effective, they ignore the ranking relationship among samples, which indicates whether one positive sample is better than another. This relationship is especially crucial for visual tracking because, among all positive candidates, there is only one best candidate, namely the one that tightly bounds the target. In this paper, we propose to exploit the ranking relationship among positive samples to learn more discriminative features that can distinguish closely similar target candidates. In addition, we propose to use normalized spatial location information to distinguish spatially neighboring candidates. Extensive experiments on challenging image sequences demonstrate the effectiveness of the proposed algorithm against several state-of-the-art methods.
Publisher
Springer Science and Business Media LLC
Cited by 1 article.