Abstract
Material classification, closely related to texture classification, consists of predicting the material class of a surface in a color image, such as wood, metal, water, wool, or ceramic. It is very challenging because of high intra-class variability: the visual appearance of a material is very sensitive to acquisition conditions such as viewpoint and lighting. Recent studies show that deep convolutional neural networks (CNNs) clearly outperform hand-crafted features in this context but suffer from a lack of training data. In this paper, we propose two contributions to address this problem. First, we provide a new material dataset covering a wide range of acquisition conditions, so that CNNs trained on these data can produce features that adapt to the diverse appearances of material samples encountered in the real world. Second, we leverage recent advances in multi-view learning to propose an original architecture designed to extract and combine features from several views of a single sample. We show that such multi-view CNNs significantly outperform the classical single-view alternatives for material classification.
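The core multi-view idea described above can be sketched as follows: a feature extractor with weights shared across views is applied to each view of a sample, and the per-view features are then combined into a single descriptor. This is a minimal toy illustration, not the paper's architecture; the linear "extractor" (standing in for a CNN backbone) and the element-wise max view-pooling operator are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shared feature extractor: one linear map standing in for a
# CNN backbone. The same weights W are used for every view of the sample.
W = rng.standard_normal((512, 3 * 32 * 32))

def extract_features(view):
    """Map one flattened 3x32x32 color view to a 512-d feature vector."""
    return W @ view.ravel()

def multi_view_descriptor(views):
    """Combine per-view features by element-wise max pooling (view-pooling)."""
    feats = np.stack([extract_features(v) for v in views])
    return feats.max(axis=0)

# Four views of one material sample under different acquisition conditions.
views = [rng.standard_normal((3, 32, 32)) for _ in range(4)]
descriptor = multi_view_descriptor(views)
print(descriptor.shape)  # (512,)
```

Note that max pooling makes the descriptor invariant to the order in which the views are presented, which is one common design choice when the views have no natural ordering.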
Subject
Electrical and Electronic Engineering; Computer Graphics and Computer-Aided Design; Computer Vision and Pattern Recognition; Radiology, Nuclear Medicine and Imaging
Cited by 2 articles.