Affiliation:
1. Computer Science and Engineering, Baskin School of Engineering, University of California, Santa Cruz, CA 95064
Abstract
An important but insufficiently addressed issue for machine learning in engineering applications is the task of model selection for new problems. Existing approaches to model selection generally focus on optimizing the learning algorithm and its associated hyperparameters. However, in real-world engineering applications, parameters that are external to the learning algorithm, such as feature engineering choices, can also have a significant impact on model performance. These external parameters do not fit into most existing approaches to model selection and are therefore often studied ad hoc or not at all. In this article, we develop a statistical design of experiments (DOE) approach to model selection based on the Taguchi method. The key idea is to use orthogonal arrays to plan a set of build-and-test experiments that study the external parameters in combination with the learning algorithm. The use of orthogonal arrays maximizes the information gained from each experiment and therefore allows the experimental space to be explored far more efficiently than with grid or random search. We demonstrate the application of the statistical DOE approach to a real-world model selection problem involving the prediction of service request escalation. Statistical DOE significantly reduced the number of experiments needed to fully explore the external parameters for this problem and successfully optimized the model with respect to an objective function that minimizes total cost, in addition to standard evaluation metrics such as accuracy, F-measure, and G-mean.
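The orthogonal-array planning step described in the abstract can be illustrated with a short, self-contained sketch. The factor names, their two levels, and the mapping onto a standard Taguchi L8(2^7) array below are hypothetical placeholders for illustration only; they are not the factors or experiments reported in the article.

# Minimal sketch of planning model-selection experiments with a Taguchi
# orthogonal array. Factor names and levels are hypothetical placeholders,
# not the parameters studied in the article.

# Standard L8(2^7) orthogonal array: 8 runs, up to 7 two-level factors
# (levels coded 0 and 1). Unused columns are simply ignored.
L8 = [
    [0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]

# External parameters (e.g., feature engineering choices) plus the learning
# algorithm, each studied at two levels. Names are illustrative only.
factors = {
    "feature_scaling":    ["none", "standardize"],
    "text_features":      ["bag_of_words", "tfidf"],
    "class_balancing":    ["none", "oversample"],
    "learning_algorithm": ["logistic_regression", "random_forest"],
}

def plan_experiments(factors, array):
    """Map each row of the orthogonal array to a concrete experiment config."""
    names = list(factors)
    plan = []
    for row in array:
        config = {name: factors[name][row[i]] for i, name in enumerate(names)}
        plan.append(config)
    return plan

if __name__ == "__main__":
    for run_id, config in enumerate(plan_experiments(factors, L8), start=1):
        # Each run would build and evaluate a model with this configuration,
        # recording the response of interest (e.g., total cost, accuracy,
        # F-measure, or G-mean) for subsequent main-effect analysis.
        print(f"Run {run_id}: {config}")

With four two-level factors, a full factorial design would require 16 builds, whereas the orthogonal array covers the factor space in 8 balanced runs; a standard main-effect analysis over the recorded responses then suggests the preferred level for each factor.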
Subject
Industrial and Manufacturing Engineering, Computer Graphics and Computer-Aided Design, Computer Science Applications, Software
Cited by
3 articles.