Affiliations:
1. Shanghai Jiao Tong University, Shanghai, China
2. University of Illinois at Urbana-Champaign, Champaign, IL, USA
3. East China Normal University, Shanghai, China
Abstract
Since the pioneering work of sliced inverse regression, sufficient dimension reduction has grown into a mature field in statistics, with broad applications to regression diagnostics, data visualisation, image processing and machine learning. In this paper, we review several popular inverse regression methods, including the sliced inverse regression (SIR) method and the principal Hessian directions (PHD) method. In addition, we adopt a conditional characteristic function approach and develop a new class of slicing‐free methods, parallel to the classical SIR and PHD, named weighted inverse regression ensemble (WIRE) and weighted PHD (WPHD), respectively. The relationship with the recently developed martingale difference divergence matrix is also revealed. Numerical studies and a real data example show that the proposed slicing‐free alternatives outperform SIR and PHD.
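To illustrate the slicing step that the proposed WIRE and WPHD methods dispense with, here is a minimal sketch of classical SIR. This is not the paper's slicing-free procedure; the function name, slice count, and whitening approach are illustrative choices, assuming standard SIR as introduced by Li (1991).

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=2):
    """Minimal sliced inverse regression (SIR) sketch.

    Estimates dimension-reduction directions by slicing the
    response, averaging the standardized predictors within each
    slice, and taking leading eigenvectors of the resulting
    between-slice covariance of those means.
    """
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    # Whitening transform via the inverse square root of Sigma
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Slice observations by the ordered response
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors, mapped back to the original X scale
    _, v = np.linalg.eigh(M)
    return Sigma_inv_sqrt @ v[:, ::-1][:, :n_dirs]
```

The choice of `n_slices` is the tuning parameter that the slicing-free alternatives avoid: results can be sensitive to it, especially for discrete or multivariate responses.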
Funder
National Natural Science Foundation of China
East China Normal University