Affiliation:
1. IBM India Research Laboratory, Indian Institute of Technology, New Delhi 110016, India
Abstract
In general, pattern classification algorithms assume that all the features are available during the construction of a classifier and its subsequent use. In many practical situations, data are recorded in different servers that are geographically apart, and each server observes features of local interest. The underlying infrastructure and other logistics (such as access control) in many cases do not permit continual synchronization. Each server thus has a partial view of the data in the sense that feature subsets (not necessarily disjoint) are available at each server. In this article, we present a classification algorithm for such distributed, vertically partitioned data. We assume that local classifiers can be constructed based on the local partial views of the data available at each server. These local classifiers can be any of the many standard classifiers (e.g., neural networks, decision trees, k-nearest neighbors). Often these local classifiers are constructed to support decision making at each location, and our focus is not on these individual local classifiers. Rather, our focus is on constructing a classifier that can use these local classifiers to achieve an error rate that is as close as possible to that of a classifier having access to the entire feature set. We empirically demonstrate the efficacy of the proposed algorithm and also provide theoretical results quantifying the loss incurred compared to the situation where the entire feature set is available to a single classifier.
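To make the vertically partitioned setting concrete, the following is a minimal sketch in Python (scikit-learn), not the authors' algorithm: each "server" trains a local classifier on its own feature subset, and a central combiner aggregates only the local outputs. The dataset, the overlapping feature partitions, and the logistic-regression combiner are illustrative assumptions; the paper's specific combination scheme and its error-rate guarantees are not reproduced here.

```python
# Sketch of classification over vertically partitioned data:
# local classifiers on partial feature views, combined centrally.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Overlapping feature subsets, one per server (assumed partition).
partitions = [list(range(0, 12)), list(range(10, 22)), list(range(20, 30))]
local_models = [DecisionTreeClassifier(max_depth=4, random_state=0),
                KNeighborsClassifier(n_neighbors=5),
                DecisionTreeClassifier(max_depth=6, random_state=0)]

# Each server fits a local classifier on its partial view only.
local_probs_tr, local_probs_te = [], []
for cols, model in zip(partitions, local_models):
    model.fit(X_tr[:, cols], y_tr)
    local_probs_tr.append(model.predict_proba(X_tr[:, cols])[:, 1])
    local_probs_te.append(model.predict_proba(X_te[:, cols])[:, 1])

# The central combiner sees only local outputs, never the raw features.
combiner = LogisticRegression()
combiner.fit(np.column_stack(local_probs_tr), y_tr)
accuracy = combiner.score(np.column_stack(local_probs_te), y_te)
print(f"combined accuracy on held-out data: {accuracy:.3f}")
```

In this sketch the accuracy of the combined classifier can be compared against a single classifier trained on all 30 features, which mirrors the loss quantified in the paper's theoretical analysis.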
Subject
Cognitive Neuroscience, Arts and Humanities (miscellaneous)
Cited by
16 articles.