Affiliation:
1. University of Western Ontario, London, ON, Canada
2. ABB Corporate Research Center, Ladenburg, Germany
Abstract
Multivariate time-series classification problems arise in many industrial settings, for example fault detection in a manufacturing process by monitoring sensor signals. It is difficult to obtain large labeled datasets in these settings, for reasons such as limitations in automatic recording, the need for expert root-cause analysis, and very limited access to human experts. Therefore, methods that perform classification in a label-efficient manner are useful for building and deploying machine learning models in industrial settings. In this work, we apply a self-supervised learning method called Contrastive Predictive Coding (CPC) to classification tasks on three industrial multivariate time-series datasets. First, the CPC neural network (CPC base) is trained on a large number of unlabeled time-series instances. Then, a standard supervised classifier such as a multi-layer perceptron (MLP) is trained on the available labeled data using the output embeddings from the pre-trained CPC base. On all three classification datasets, we observe increased label efficiency (the ability to reach a target accuracy level with fewer labeled examples). In the low-data regime (tens or a few hundreds of labeled examples), the CPC pre-trained model achieves high accuracy with up to 15x fewer labels than a model trained only on labeled data. We also conduct experiments to evaluate the usefulness of CPC pre-trained classifiers as base models to start an active learning loop, and find that uncertainty sampling does not perform significantly better than random sampling during the initial queries.
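The two-stage protocol described in the abstract (self-supervised pre-training, then a supervised head on frozen embeddings) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the "encoder" here is a fixed random projection standing in for a trained CPC base, the data are synthetic, and the linear head replaces the MLP; all names and shapes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a CPC-pretrained encoder: a frozen projection that pools a
# multivariate series (T timesteps x C channels) into one embedding vector.
# A real CPC base would be a neural network trained on unlabeled series.
T, C, EMB = 50, 4, 16
W_enc = rng.normal(size=(C, EMB))

def embed(series):
    """Map a (T, C) series to an (EMB,) embedding via the frozen encoder."""
    return np.tanh(series @ W_enc).mean(axis=0)

# Tiny synthetic two-class dataset: the class label shifts the channel means.
def make_batch(label, n):
    return rng.normal(size=(n, T, C)) + (0.8 if label else -0.8)

X = np.concatenate([make_batch(0, 30), make_batch(1, 30)])
y = np.array([0] * 30 + [1] * 30)
E = np.vstack([embed(s) for s in X])  # frozen embeddings, shape (60, EMB)

# Stage 2: train only a small supervised head (here logistic regression)
# on the labeled embeddings; the encoder weights stay fixed throughout.
w, b = np.zeros(EMB), 0.0
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(E @ w + b)))   # predicted P(label=1)
    g = p - y                                 # gradient of log loss
    w -= 0.1 * (E.T @ g) / len(y)
    b -= 0.1 * g.mean()

acc = (((1.0 / (1.0 + np.exp(-(E @ w + b)))) > 0.5) == y).mean()
```

Because only the small head is fit to labels while the representation comes from (pre-)training on unlabeled data, few labeled examples suffice, which is the label-efficiency effect the paper measures.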
Publisher
Association for Computing Machinery (ACM)