Affiliation:
1. School of Statistics, KLATASDS-MOE, East China Normal University, Shanghai 200062, People's Republic of China
Abstract
Neural networks and random forests are popular and promising tools for machine learning. This article explores the proper integration of these two approaches for nonparametric regression, aiming to improve on the performance of either approach alone. Specifically, we propose a neural network estimator with local enhancement provided by random forests. It naturally synthesizes the local relation adaptivity of random forests and the strong global approximation ability of neural networks. Based on the classical empirical risk minimization framework, we establish a nonasymptotic error bound for the estimator. By utilizing advanced U-process theory and an appropriate network structure, we can further improve the convergence rate to the nearly minimax rate. With the assistance of random forests, we can also implement gradient learning with neural networks. Comprehensive simulation studies and real data applications demonstrate the superiority of our proposal.
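To make the integration idea concrete, the following is a minimal sketch of one simple way to give a neural network "local enhancement" from a random forest: the forest's local prediction is appended as an extra input feature before fitting the network, in the spirit of stacking. This is an illustrative assumption, not the paper's exact estimator; the data, model sizes, and the stacking construction are all hypothetical choices made for the example.

```python
# Hedged sketch (NOT the paper's estimator): augment a neural network's
# inputs with a random forest's prediction, so the network can exploit
# the forest's local adaptivity while supplying global smoothness.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 3))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=500)

# Step 1: fit a random forest to capture local structure in the regression.
forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Step 2: append the forest's prediction as an additional input feature.
X_aug = np.column_stack([X, forest.predict(X)])

# Step 3: fit a small feed-forward network on the augmented features.
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                   random_state=0).fit(X_aug, y)

# Prediction at new points uses the same augmentation.
X_new = rng.uniform(-1, 1, size=(10, 3))
preds = net.predict(np.column_stack([X_new, forest.predict(X_new)]))
```

The stacking construction here is one of several plausible ways to combine the two learners; the article's estimator and its theoretical guarantees are developed under its own framework.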
Funder
National Key R&D Program of China
National Natural Science Foundation of China
Shanghai Pilot Program for Basic Research
Publisher
Oxford University Press (OUP)