Abstract
The Reduction by Space Partitioning (RSP3) algorithm is a well-known data reduction technique. It summarizes the training data and generates representative prototypes, with the goal of reducing the computational cost of an instance-based classifier without sacrificing accuracy. The algorithm repeatedly divides the initial training data into subsets until all of them are homogeneous, i.e., contain instances of a single class. To divide a non-homogeneous subset, the algorithm finds its two furthest instances and assigns every instance to the nearer of the two. This is a computationally expensive task, since all pairwise distances among the instances of a non-homogeneous subset must be calculated. Moreover, noise in the training data leads to a large number of small homogeneous subsets, many of which contain only a single instance. Such instances are most likely noise, yet the algorithm still generates prototypes for them. This paper proposes simple and fast variations of RSP3 that avoid the computationally costly partitioning steps and remove noisy training instances. An experimental study conducted on sixteen datasets, together with the corresponding statistical tests, shows that the proposed variations are much faster and achieve higher reduction rates than the conventional RSP3 without negatively affecting accuracy.
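A minimal sketch of the recursive partitioning described above is given below, assuming Euclidean distances and mean-based prototypes; the function name rsp3_prototypes and the degenerate-case handling are illustrative choices, not details taken from the paper.

```python
import numpy as np

def rsp3_prototypes(X, y):
    """Recursively split (X, y) until every subset is homogeneous,
    then emit one mean prototype per subset (illustrative sketch)."""
    prototypes, labels = [], []

    def split(idx):
        classes = np.unique(y[idx])
        if len(classes) == 1:               # homogeneous subset: generate a prototype
            prototypes.append(X[idx].mean(axis=0))
            labels.append(classes[0])
            return
        # Costly step: all pairwise distances, to locate the two furthest instances.
        D = np.linalg.norm(X[idx][:, None] - X[idx][None, :], axis=-1)
        a, b = np.unravel_index(np.argmax(D), D.shape)
        # Assign each instance to the nearer of the two furthest instances.
        to_a = D[:, a] <= D[:, b]
        if not np.any(~to_a):               # degenerate case (duplicate points): stop splitting
            prototypes.append(X[idx].mean(axis=0))
            labels.append(np.bincount(y[idx]).argmax())
            return
        split(idx[to_a])
        split(idx[~to_a])

    split(np.arange(len(y)))
    return np.array(prototypes), np.array(labels)
```

With a labeled training set, `P, c = rsp3_prototypes(X_train, y_train)` yields a condensed prototype set that a 1-NN classifier can use in place of the full training data; the quadratic-cost distance computation inside each split is the step the proposed variations aim to avoid.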