Affiliation:
1. Department of Computer Science, King Saud University, Riyadh 11543, Saudi Arabia
2. Department of Electrical Engineering, King Saud University, Riyadh 11421, Saudi Arabia
Abstract
Naïve Bayes (NB) classification performance degrades when the conditional independence assumption is violated or when the conditional probability estimates are unrealistic, owing to attribute correlation and scarce data, respectively. Many works address these two problems, but few tackle them simultaneously. Existing methods heuristically employ information theory or apply gradient optimization to enhance NB classification performance; however, to the best of our knowledge, the generalization capability of the enhanced models deteriorates, especially on scant data. In this work, we propose a fine-grained boosting of the NB classifier that identifies hidden, potentially discriminative attribute values that lead the NB model to underfit or overfit the training data, and that enhances their predictive power. We employ the complement harmonic average of the conditional probability terms to measure, for each attribute value, the divergence of its conditional distribution and its impact on classification performance. The proposed method is subtle yet effective in capturing the inter-correlation (between classes) and intra-correlation (within a class) of attribute values, and in measuring their impact on the model's performance. We compare our proposed complement-class harmonized Naïve Bayes classifier (CHNB) with state-of-the-art Naïve Bayes and imbalanced ensemble boosting methods on general and imbalanced machine-learning benchmark datasets, respectively. The empirical results demonstrate that CHNB significantly outperforms the compared methods.
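To make the general idea concrete, below is a minimal, hedged sketch in Python, not the authors' CHNB algorithm (whose exact update rule is given in the paper itself): a categorical Naive Bayes with Laplace smoothing plus a hypothetical per-attribute-value score based on the harmonic mean of the class-conditional probabilities, so that values whose conditionals diverge across classes stand out. The function names (train_nb, harmonic_divergence, predict) and the scoring formula are illustrative assumptions, not taken from the paper.

from collections import defaultdict
import math

def train_nb(X, y, alpha=1.0):
    # Estimate priors P(c) and conditionals P(x_i = v | c) with Laplace smoothing.
    classes = sorted(set(y))
    n_attrs = len(X[0])
    prior = {c: (y.count(c) + alpha) / (len(y) + alpha * len(classes)) for c in classes}
    cond = defaultdict(dict)  # cond[(i, v)][c] = P(x_i = v | c)
    for i in range(n_attrs):
        values = sorted({row[i] for row in X})
        for c in classes:
            rows_c = [row for row, t in zip(X, y) if t == c]
            for v in values:
                count = sum(1 for row in rows_c if row[i] == v)
                cond[(i, v)][c] = (count + alpha) / (len(rows_c) + alpha * len(values))
    return prior, cond, classes

def harmonic_divergence(cond, key, classes):
    # Hypothetical score: 1 - harmonic_mean / arithmetic_mean of P(v | c) over classes.
    # Roughly 0 for an uninformative attribute value; grows as its conditionals diverge.
    probs = [cond[key][c] for c in classes]
    hm = len(probs) / sum(1.0 / p for p in probs)
    am = sum(probs) / len(probs)
    return 1.0 - hm / am

def predict(x, prior, cond, classes):
    # Standard NB decision rule in log space (assumes every value of x was seen in training).
    def log_post(c):
        return math.log(prior[c]) + sum(math.log(cond[(i, v)][c]) for i, v in enumerate(x))
    return max(classes, key=log_post)

# Tiny usage example on a toy categorical dataset.
X = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
y = ["no", "no", "yes", "yes"]
prior, cond, classes = train_nb(X, y)
print(predict(("rain", "mild"), prior, cond, classes))           # -> "yes"
print(round(harmonic_divergence(cond, (0, "rain"), classes), 3))  # divergence score for value "rain"

A score of this kind only illustrates why a harmonic-style average is sensitive to divergence between class-conditional probabilities; how such a measure is used to fine-tune the probability terms in CHNB is described in the article.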
Subject
Fluid Flow and Transfer Processes, Computer Science Applications, Process Chemistry and Technology, General Engineering, Instrumentation, General Materials Science
Cited by
1 article.