High-dimensional sparse single-index regression via Hilbert-Schmidt independence criterion
DOI: 10.1007/s11222-024-10399-4 · zbMATH Open: 1539.62014 · MaRDI QID: Q6547751
Xin Chen, Shuaida He, Runxiong Wu, Chang Deng, J. Zhang
Publication date: 31 May 2024
Published in: Statistics and Computing
Keywords: variable selection; single-index models; majorization-minimization; sufficient dimension reduction; large \(p\) small \(n\); Hilbert-Schmidt independence criterion
MSC classification: Computational methods for problems pertaining to statistics (62-08); Nonparametric regression and quantile regression (62G08); Estimation in multivariate analysis (62H12)
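For context (not part of the original record): the Hilbert-Schmidt independence criterion named in the title and keywords is typically estimated from a sample \((x_i, y_i)_{i=1}^{n}\) by the standard biased V-statistic of Gretton et al.; a minimal sketch of that textbook formula, assuming kernel Gram matrices \(K_{ij} = k(x_i, x_j)\) and \(L_{ij} = l(y_i, y_j)\) and the centering matrix \(H = I_n - n^{-1} \mathbf{1}_n \mathbf{1}_n^{\top}\), is
\[
\widehat{\mathrm{HSIC}}(X, Y) \;=\; \frac{1}{(n-1)^{2}} \operatorname{tr}(K H L H).
\]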
Cites Work
- Title not available
- Title not available
- Sparse Generalized Eigenvalue Problem: Optimal Statistical Rates via Truncated Rayleigh Flow
- Sliced Regression for Dimension Reduction
- Sparse Sliced Inverse Regression Via Lasso
- Sufficient dimension reduction based on an ensemble of minimum average variance estimators
- Coordinate-independent sparse sufficient dimension reduction and variable selection
- A unified primal-dual algorithm framework based on Bregman iteration
- MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection
- Principal fitted components for dimension reduction in regression
- Successive direction extraction for estimating the central subspace in a multiple-index regression
- An integral transform method for estimating the central mean and central subspaces
- Sparse CCA: adaptive estimation and computational barriers
- Estimating a sparse reduction for general regression in high dimensions
- On consistency and sparsity for sliced inverse regression in high dimensions
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Regression analysis under link violation
- Testing predictor contributions in sufficient dimension reduction
- Sparse SIR: optimal rates and adaptive estimation
- Generalized alternating direction method of multipliers: new theoretical insights and applications
- Approximation Theorems of Mathematical Statistics
- Fourier Methods for Estimating the Central Subspace and the Central Mean Subspace in Regression
- Direction Estimation in Single-Index Regressions via Hilbert-Schmidt Independence Criterion
- On Directional Regression for Dimension Reduction
- Sliced Inverse Regression for Dimension Reduction
- On the Interpretation of Regression Plots
- Graphics for Regressions With a Binary Response
- A convex formulation for high-dimensional sparse sliced inverse regression
- Feature Screening via Distance Correlation Learning
- An Adaptive Estimation of Dimension Reduction Space
- The Linearized Alternating Direction Method of Multipliers for Dantzig Selector
- Linearized augmented Lagrangian and alternating direction methods for nuclear norm minimization
- A Semiparametric Approach to Dimension Reduction
- A Review on Dimension Reduction
- Sparse Minimum Discrepancy Approach to Sufficient Dimension Reduction with Simultaneous Variable Selection in Ultrahigh Dimension
- Likelihood-Based Sufficient Dimension Reduction
- Model-Free Variable Selection
- A note on shrinkage sliced inverse regression
- Sequential Sufficient Dimension Reduction for Large p, Small n Problems
- Algorithmic Learning Theory
- Sparse sufficient dimension reduction
- Sliced Inverse Regression with Regularizations
- Sufficient Dimension Reduction via Inverse Regression
- Comment
- Discussion of: Brownian distance covariance
- Subspace Estimation with Automatic Dimension and Variable Selection in Sufficient Dimension Reduction
- Efficient Sparse Estimate of Sufficient Dimension Reduction in High Dimension