Deep nonlinear sufficient dimension reduction
From MaRDI portal
Publication: 6608686
DOI: 10.1214/24-aos2390
MaRDI QID: Q6608686
Zhou Yu, Yu Ling Jiao, Rui Qiu, Yinfeng Chen
Publication date: 20 September 2024
Published in: The Annals of Statistics
Keywords: sufficient dimension reduction; U-process; deep neural networks; generalized martingale difference divergence
MSC classifications:
- Nonparametric regression and quantile regression (62G08)
- Estimation in multivariate analysis (62H12)
- Prediction theory (aspects of stochastic processes) (60G25)
- Neural nets and related approaches to inference from stochastic processes (62M45)
Cites Work
- Measuring and testing dependence by correlation of distances
- Efficient estimation in sufficient dimension reduction
- Principal support vector machines for linear and nonlinear sufficient dimension reduction
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Successive direction extraction for estimating the central subspace in a multiple-index regression
- Dimension reduction for nonelliptically distributed predictors
- An RKHS formulation of the inverse regression dimension-reduction problem
- Regression analysis under link violation
- Asymptotics for kernel estimate of sliced inverse regression
- Optimal global rates of convergence for nonparametric regression
- Weak convergence and empirical processes. With applications to statistics
- A general theory for nonlinear sufficient dimension reduction: formulation and estimation
- Nonlinear sufficient dimension reduction for functional data
- Fusing sufficient dimension reduction with neural networks
- Conditional variance estimator for sufficient dimension reduction
- Nonparametric regression using deep neural networks with ReLU activation function
- A kernel-based measure for conditional mean dependence
- Detecting independence of random vectors: generalized distance covariance and Gaussian covariance
- Local Rademacher complexities
- Contour regression: a general approach to dimension reduction
- Estimating the Structural Dimension of Regressions Via Parametric Inverse Regression
- Reducing the Dimensionality of Data with Neural Networks
- On Directional Regression for Dimension Reduction
- Principal Hessian Directions Revisited
- Determining the Dimension in Sliced Inverse Regression and Related Methods
- Sliced Inverse Regression for Dimension Reduction
- Determining the Dimensionality in Sliced Inverse Regression
- High-Dimensional Statistics
- An Adaptive Estimation of Dimension Reduction Space
- Martingale Difference Divergence Matrix and Its Application to Dimension Reduction for Stationary Multivariate Time Series
- Martingale Difference Correlation and Its Use in High-Dimensional Variable Screening
- A new class of measures for testing independence
- Dimension Reduction Forests: Local Variable Importance Using Structured Random Forests
- Expected Conditional Characteristic Function-based Measures for Testing Independence
- Deep Network Approximation Characterized by Number of Neurons
- Algorithmic Learning Theory
- On Estimation Efficiency of the Central Mean Subspace
- A Class of Statistics with Asymptotically Normal Distribution
- Fréchet sufficient dimension reduction for random objects
- Comment
- Generalized martingale difference divergence: detecting conditional mean independence with applications in variable screening
- Deep dimension reduction for supervised representation learning
This page was built for publication: Deep nonlinear sufficient dimension reduction