Authors:
Cody Coleman, Edward Chou, Julian Katz-Samuels, Sean Culatana, Peter Bailis, Alexander C. Berg, Robert Nowak, Roshan Sumbaly, Matei Zaharia, I. Zeki Yalniz
Abstract
Many active learning and search approaches are intractable for large-scale industrial settings with billions of unlabeled examples. Existing approaches search globally for the optimal examples to label, scaling linearly or even quadratically with the size of the unlabeled data. In this paper, we improve the computational efficiency of active learning and search methods by restricting the candidate pool for labeling to the nearest neighbors of the currently labeled set instead of scanning over all of the unlabeled data. We evaluate several selection strategies in this setting on three large-scale computer vision datasets: ImageNet, OpenImages, and a de-identified and aggregated dataset of 10 billion publicly shared images provided by a large internet company. Our approach achieved mAP and recall similar to the traditional global approach while reducing the computational cost of selection by up to three orders of magnitude, enabling web-scale active learning.
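The selection step described in the abstract amounts to ranking only the neighbors of the currently labeled set rather than the full unlabeled pool. Below is a minimal sketch of that restricted-pool selection, assuming precomputed embeddings and a per-example selection score (e.g. model uncertainty); the function name select_batch and the exact k-NN index are illustrative, not the authors' released code, and a billion-scale deployment would swap in an approximate nearest-neighbor index.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def select_batch(embeddings, labeled_idx, scores, k=100, batch_size=10):
    # Restrict the candidate pool to the k nearest neighbors of the
    # currently labeled examples instead of scanning all unlabeled data.
    nn = NearestNeighbors(n_neighbors=k).fit(embeddings)
    _, neighbor_idx = nn.kneighbors(embeddings[labeled_idx])
    candidates = np.setdiff1d(neighbor_idx.ravel(), labeled_idx)

    # Rank only the candidates by the selection score (higher = more
    # informative) and return the next batch to send for labeling.
    return candidates[np.argsort(-scores[candidates])[:batch_size]]

Because the ranking touches only the candidate pool, each round's selection cost depends on the size of the labeled set times k rather than on the total number of unlabeled examples, which is the source of the speedup claimed above.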
Publisher
Association for the Advancement of Artificial Intelligence (AAAI)
Cited by
8 articles.
1. A Fast Similarity Matrix Calibration Method with Incomplete Query. Proceedings of the ACM Web Conference 2024, 2024-05-13.
2. Active Batch Sampling for Multi-label Classification with Binary User Feedback. 2024 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2024-01-03.
3. SeeSaw: Interactive Ad-hoc Search Over Image Databases. Proceedings of the ACM on Management of Data, 2023-12-08.
4. Active Learning for Open-Set Annotation Using Contrastive Query Strategy. Neural Information Processing, 2023-11-14.
5. Agile Modeling: From Concept to Classifier in Minutes. 2023 IEEE/CVF International Conference on Computer Vision (ICCV), 2023-10-01.