Affiliation
1. University of Texas at Austin, Austin, USA
2. University of Texas at Austin; University of São Paulo at São Carlos, Brazil
Abstract
Unsupervised models can provide supplementary soft constraints that help classify new “target” data, because similar instances in the target set are more likely to share the same class label. Such models can also help detect possible differences between the training and target distributions, which is useful in applications where concept drift may occur, as in transfer learning settings. This article describes a general optimization framework that takes as input class membership estimates from existing classifiers learned on previously encountered “source” (or training) data, as well as a similarity matrix from a cluster ensemble operating solely on the target (or test) data to be classified, and yields a consensus labeling of the target data. More precisely, the application settings considered are nontransductive semisupervised and transfer learning scenarios in which the training data are used only to build an ensemble of classifiers and are subsequently discarded before the target data are classified. The framework admits a wide range of loss functions and classification/clustering methods. It exploits properties of Bregman divergences in conjunction with Legendre duality to yield a principled and scalable approach. A variety of experiments show that the proposed framework can yield results substantially superior to those obtained by naïvely applying classifiers learned on the original task to the target data. In addition, we show that the proposed approach, although not conceptually transductive, can outperform some popular transductive learning techniques.
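The abstract describes the framework only at a high level. As a rough illustration of the idea of combining classifier outputs with a cluster-ensemble similarity matrix, the Python sketch below iterates a squared-loss consensus objective (squared loss being one Bregman divergence). The function name consensus_labels, the fixed-point update, and all parameter names are assumptions made for illustration; they are not taken from the article and do not reproduce its exact algorithm.

import numpy as np

def consensus_labels(pi, S, alpha=1.0, n_iter=50, tol=1e-6):
    """
    Illustrative consensus of classifier outputs and a cluster-ensemble
    similarity matrix (squared-loss instance of a Bregman divergence).

    pi    : (n, k) array of averaged class-probability estimates for the n
            target points, produced by classifiers trained on source data.
    S     : (n, n) symmetric array of pairwise co-association similarities
            from a cluster ensemble built only on the target data.
    alpha : weight trading off fidelity to the classifier estimates against
            smoothness over similar target points.

    Approximately minimizes, by fixed-point iteration:
        sum_i ||y_i - pi_i||^2 + alpha * sum_{i,j} S_ij ||y_i - y_j||^2
    """
    Y = pi.astype(float).copy()
    deg = S.sum(axis=1, keepdims=True)           # total similarity per point
    for _ in range(n_iter):
        Y_new = (pi + alpha * (S @ Y)) / (1.0 + alpha * deg)
        Y_new /= Y_new.sum(axis=1, keepdims=True)  # keep rows as probabilities
        if np.abs(Y_new - Y).max() < tol:
            return Y_new
        Y = Y_new
    return Y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, k = 6, 3
    pi = rng.dirichlet(np.ones(k), size=n)        # toy classifier estimates
    A = rng.random((n, n)); S = (A + A.T) / 2     # toy symmetric similarities
    np.fill_diagonal(S, 0.0)
    Y = consensus_labels(pi, S, alpha=0.5)
    print(Y.argmax(axis=1))                       # consensus hard labels

In this toy setup, alpha controls how strongly the cluster-ensemble similarities smooth the classifier estimates; with alpha set to 0 the classifier outputs are returned unchanged.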
Funder
Division of Information and Intelligent Systems
Fundação de Amparo à Pesquisa do Estado de São Paulo
Conselho Nacional de Desenvolvimento Científico e Tecnológico
Publisher
Association for Computing Machinery (ACM)
Cited by
15 articles.