Author:
Guodong Chen, Xia Zeyang, Sun Rongchuan, Wang Zhenhua, Sun Lining
Abstract
Purpose – Detecting objects in images and videos is a difficult task that has long challenged the field of computer vision. Most object detection algorithms are sensitive to background clutter and occlusion, and cannot localize the edges of the object. An object's shape is typically the most discriminative cue for its recognition by humans. The purpose of this paper is to introduce a model‐based object detection method which uses only shape‐fragment features.
Design/methodology/approach – The object shape model is learned from a small set of training images, and all object models are composed of shape fragments. The object model is represented at multiple scales.
Findings – The major contributions of this paper are the application of a learned shape‐fragment‐based model for object detection in complex environments and a novel two‐stage object detection framework.
Originality/value – The results presented in this paper are competitive with other state‐of‐the‐art object detection methods.
Subject
Electrical and Electronic Engineering, Industrial and Manufacturing Engineering
References: 39 articles.
1. Bai, X., Li, Q., Latecki, L., Liu, W. and Tu, Z. (2009), “Shape band: a deformable object detection approach”, CVPR, pp. 418‐25.
2. Bay, H., Tuytelaars, T. and Van Gool, L. (2006), “SURF: speeded up robust features”, Lecture Notes in Computer Science, Vol. 3951, pp. 404‐17.
3. Belongie, S., Malik, J. and Puzicha, J. (2002), “Shape matching and object recognition using shape contexts”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 24 No. 4, pp. 509‐22.
4. Berg, A., Berg, T. and Malik, J. (2005), “Shape matching and object recognition using low distortion correspondences”, IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Vol. 1, pp. 26‐31.
5. Bouganis, A. and Shanahan, M. (2008), “Flexible object recognition in cluttered scenes using relative point distribution models”, 19th International Conference on Pattern Recognition, pp. 1‐5.
Cited by: 5 articles.