Affiliation:
1. National Research University Higher School of Economics; Renaissance Credit Bank
2. Lomonosov Moscow State University
Abstract
Machine learning methods have been successful in various aspects of bank lending. Over years of operation, banks have accumulated huge amounts of data about borrowers. On the one hand, this has made it possible to predict borrower behavior more accurately; on the other hand, it has given rise to the problem of data redundancy, which greatly complicates model development. Feature selection methods, which help improve the quality of models, are applied to solve this problem. Feature selection methods can be divided into three main types: filters, wrappers, and embedded methods. Filters are simple and time-efficient methods that can help discover one-dimensional relationships. Wrappers and embedded methods are more effective at feature selection because they account for multi-dimensional relationships, but they are resource-consuming and may fail to process large samples with many features. In this article, the authors propose a combined feature selection scheme (CFSS), in which the first stages of selection use coarse filters and the final stage uses wrappers for high-quality selection. This architecture increases the quality of selection and reduces the time needed to process the large multi-dimensional samples used in the development of industrial models. Experiments conducted by the authors on four types of bank modelling tasks (survey scoring, behavioral scoring, customer response to cross-selling, and delayed debt collection) have shown that the proposed method outperforms classical approaches that use only filters or only wrappers.
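To illustrate the general filter-then-wrapper idea described in the abstract, below is a minimal sketch assuming scikit-learn. The specific components (SelectKBest with mutual_info_classif as the coarse filter, SequentialFeatureSelector with LogisticRegression as the wrapper) and the cut-off values (k=50, n_features_to_select=15) are illustrative assumptions, not the authors' CFSS implementation.

from sklearn.datasets import make_classification
from sklearn.feature_selection import (SelectKBest, mutual_info_classif,
                                        SequentialFeatureSelector)
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Synthetic stand-in for a wide banking sample: many features, few informative.
X, y = make_classification(n_samples=2000, n_features=200,
                           n_informative=15, random_state=0)

pipe = Pipeline([
    # Stage 1: cheap univariate filter discards clearly irrelevant features.
    ("filter", SelectKBest(mutual_info_classif, k=50)),
    # Stage 2: wrapper search over the surviving features,
    # which accounts for multi-dimensional relationships.
    ("wrapper", SequentialFeatureSelector(
        LogisticRegression(max_iter=1000),
        n_features_to_select=15,
        direction="forward",
        cv=3,
    )),
    # Final model trained on the selected subset.
    ("model", LogisticRegression(max_iter=1000)),
])

pipe.fit(X, y)

Running the wrapper only after the filter stage is what keeps the overall selection tractable on large multi-dimensional samples, since the expensive search operates on a much smaller candidate set.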
Publisher
Financial University under the Government of the Russian Federation
Subject
Management of Technology and Innovation; Economics, Econometrics and Finance (miscellaneous); Finance; Business, Management and Accounting (miscellaneous)