Authors:
Tamrakar Preeti, Syed Ibrahim S. P.
Abstract
Lazy Learning Associative Classification (LLAC) is one of the algorithms that yields better outcomes than traditional associative classification systems: the processing of training data is delayed until a test instance is received, whereas in eager learning the system begins processing training data before receiving any queries. Traditional methods assume that all items within a transaction are equally important, which is not always true. This paper proposes a new framework called lazy learning associative classification with weighted kNN (LLAC_WkNN), which first applies LLAC to the dataset to obtain a subset of rules and then applies the weighted kNN (WkNN) algorithm to this subset to predict the class label of an unseen test case. This enhances the accuracy of the classifier. However, WkNN also gives outliers more weight. This limitation is resolved by applying a Dual Distance Weight to LLAC, named LLAC_DWkNN, which assigns less weight to outliers and thereby improves the accuracy of the classifier further. These algorithms have been applied to different datasets, and the experimental results demonstrate that the proposed methods are efficient compared to traditional and other existing systems.
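The neighbour-weighting step described in the abstract can be illustrated in isolation. The sketch below is an assumption-laden illustration, not the authors' implementation: `wknn_predict` uses plain inverse-distance weights (so a distant outlier neighbour can still carry non-trivial weight), while `dwknn_predict` uses a dual distance weight of the form w_i = (d_k - d_i)/(d_k - d_1) · (d_k + d_1)/(d_k + d_i), which damps the vote of the farthest, outlier-like neighbours. The function names and the exact weight formula are illustrative assumptions.

```python
import numpy as np

def wknn_predict(X_train, y_train, x, k=3):
    """Weighted kNN sketch: each of the k nearest neighbours votes
    with weight 1/d_i (inverse distance)."""
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    votes = {}
    for i in idx:
        w = 1.0 / (d[i] + 1e-9)          # small epsilon avoids division by zero
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + w
    return max(votes, key=votes.get)

def dwknn_predict(X_train, y_train, x, k=3):
    """Dual distance-weighted kNN sketch (assumed weight form):
    w_i = (d_k - d_i)/(d_k - d_1) * (d_k + d_1)/(d_k + d_i),
    where d_1 and d_k are the nearest and farthest of the k distances.
    The farthest neighbour gets weight 0, so outliers contribute less."""
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    ds = d[idx]
    d1, dk = ds[0], ds[-1]
    votes = {}
    for rank, i in enumerate(idx):
        if dk == d1:                      # all neighbours equidistant
            w = 1.0
        else:
            w = (dk - ds[rank]) / (dk - d1) * (dk + d1) / (dk + ds[rank])
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + w
    return max(votes, key=votes.get)
```

In the full LLAC_WkNN / LLAC_DWkNN pipeline, `X_train`/`y_train` would be replaced by the rule subset that LLAC generates for the given test instance; here they are plain feature vectors purely to keep the weighting step self-contained.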
References (19 articles)
1. Liu B., Hsu W. and Ma Y., "Integrating Classification and Association Rule Mining," in Proc. 4th International Conference on Knowledge Discovery and Data Mining (KDD), 1998.
2. Veloso A., Meira W. and Zaki M. J., "Lazy Associative Classification," in Proc. Sixth International Conference on Data Mining (ICDM'06), Hong Kong, China, 2006.
3. "Integrating associative rule-based classification with Naïve Bayes for text classification."
4. Tamrakar P., Roy S. S., Satapathy B. and Syed I. S. P., "Integration of lazy learning associative classification with kNN algorithm," in Proc. Vision Towards Emerging Trends in Communication and Networking (ViTECoN), Vellore, 2019.
Cited by 2 articles.