An Outer-Product-of-Gradient Approach to Dimension Reduction and Its Application to Classification in High Dimensional Space
From MaRDI portal
Publication: 6077561
DOI: 10.1080/01621459.2021.2003202
OpenAlex: W3211651822
MaRDI QID: Q6077561
Yingcun Xia, Unnamed Author, Unnamed Author
Publication date: 18 October 2023
Published in: Journal of the American Statistical Association
Full work available at URL: https://figshare.com/articles/journal_contribution/An_Outer-Product-of-Gradient_Approach_to_Dimension_Reduction_and_Its_Application_to_Classification_in_High_Dimensional_Space/16967322
Cites Work
- Measuring and testing dependence by correlation of distances
- Bagging predictors
- Efficient estimation in sufficient dimension reduction
- A constructive approach to the estimation of dimension reduction directions
- Sparse Sliced Inverse Regression Via Lasso
- On marginal sliced inverse regression for ultrahigh dimensional model-free feature selection
- A survey of multilinear subspace learning for tensor data
- Optimal weighted nearest neighbour classifiers
- Component selection and smoothing in multivariate nonparametric regression
- Successive direction extraction for estimating the central subspace in a multiple-index regression
- On consistency and sparsity for sliced inverse regression in high dimensions
- Principal manifolds for data visualization and dimension reduction. Reviews and original papers presented partially at the workshop 'Principal manifolds for data cartography and dimension reduction', Leicester, UK, August 24--26, 2006.
- Kernel dimension reduction in regression
- Rodeo: Sparse, greedy nonparametric regression
- Contour regression: a general approach to dimension reduction
- Extensions of Lipschitz mappings into a Hilbert space
- On Directional Regression for Dimension Reduction
- Sliced Inverse Regression for Dimension Reduction
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- A convex formulation for high-dimensional sparse sliced inverse regression
- Multiclass Sparse Discriminant Analysis
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Feature Screening via Distance Correlation Learning
- An elementary proof of a theorem of Johnson and Lindenstrauss
- A Semiparametric Approach to Dimension Reduction
- Gradient-Based Kernel Dimension Reduction for Regression
- Sparse Minimum Discrepancy Approach to Sufficient Dimension Reduction with Simultaneous Variable Selection in Ultrahigh Dimension
- Random-projection Ensemble Classification
- Sequential Sufficient Dimension Reduction for Large p, Small n Problems
- Direction estimation in single-index regressions
- Comment
- Random forests