Abstract
In many application settings, labeling data examples is a costly endeavor, while unlabeled examples are abundant and cheap to produce. Labeling examples can be particularly problematic in an online setting, where there can be arbitrarily many examples that arrive at high frequencies. It is also problematic when we need to predict complex values (e.g., multiple real values), a task that has started receiving considerable attention, but mostly in the batch setting. In this paper, we propose a method for online semi-supervised multi-target regression. It is based on incremental trees for multi-target regression and the predictive clustering framework. Furthermore, it utilizes unlabeled examples to improve its predictive performance as compared to using just the labeled examples. We compare the proposed iSOUP-PCT method with supervised tree methods, which do not use unlabeled examples, and with an oracle method, which uses unlabeled examples as though they were labeled. Additionally, we compare the proposed method to the available state-of-the-art methods. The method achieves good predictive performance at the cost of increased consumption of computational resources as compared to its supervised variant. The proposed method also outperforms the state of the art when very few labeled examples are available, while achieving comparable performance when labeled examples are more common.
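The setting described above is typically evaluated prequentially (test-then-train) on a stream in which only a fraction of examples arrive with labels. Below is a minimal, illustrative Python sketch of such a loop, assuming a running-mean stand-in model; the names MeanPerTargetRegressor, prequential_eval, and label_fraction are hypothetical and introduced here only for illustration, not the paper's iSOUP-PCT implementation.

# Illustrative sketch only: a prequential (test-then-train) loop for online
# multi-target regression with partially labeled data. The model below is a
# simple running-mean baseline standing in for iSOUP-PCT, whose actual
# implementation is not shown here.
import random
import math

class MeanPerTargetRegressor:
    """Hypothetical stand-in model: predicts the running mean of each target."""
    def __init__(self, n_targets):
        self.n = 0
        self.sums = [0.0] * n_targets

    def predict_one(self, x):
        if self.n == 0:
            return [0.0] * len(self.sums)
        return [s / self.n for s in self.sums]

    def learn_one(self, x, y):
        self.n += 1
        for i, yi in enumerate(y):
            self.sums[i] += yi

def prequential_eval(stream, model, label_fraction=0.1, seed=0):
    """Test-then-train: predict every example, but learn only from the
    fraction of examples that arrive with labels."""
    rng = random.Random(seed)
    sq_err, n_eval = 0.0, 0
    for x, y in stream:
        y_hat = model.predict_one(x)
        sq_err += sum((a - b) ** 2 for a, b in zip(y, y_hat)) / len(y)
        n_eval += 1
        if rng.random() < label_fraction:   # only a few examples are labeled
            model.learn_one(x, y)
        # else: the example is unlabeled; a semi-supervised learner such as
        # iSOUP-PCT would still use x here to refine its model structure,
        # whereas this baseline simply ignores it
    return math.sqrt(sq_err / n_eval)

# Toy usage on a synthetic two-target stream.
if __name__ == "__main__":
    rng = random.Random(1)
    stream = [([rng.random()], [rng.random(), rng.random() * 2]) for _ in range(1000)]
    model = MeanPerTargetRegressor(n_targets=2)
    print("average RMSE:", prequential_eval(stream, model, label_fraction=0.05))

In the sketch, unlabeled examples are simply skipped by the baseline; the point of the proposed semi-supervised method is precisely to also exploit them.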
Funder
Horizon 2020 Framework Programme
Javna Agencija za Raziskovalno Dejavnost RS
Publisher
Springer Science and Business Media LLC
Subject
Artificial Intelligence, Software
Cited by
5 articles.