Gradient-Based Kernel Dimension Reduction for Regression
Publication: Q4975356
DOI: 10.1080/01621459.2013.838167 · zbMath: 1367.62118 · OpenAlex: W2036163871 · MaRDI QID: Q4975356
Publication date: 4 August 2017
Published in: Journal of the American Statistical Association
Full work available at URL: https://doi.org/10.1080/01621459.2013.838167
Related Items (22)
- A selective overview of sparse sufficient dimension reduction
- Efficient kernel-based variable selection with sparsistency
- Functional sufficient dimension reduction through average Fréchet derivatives
- Minimal σ-field for flexible sufficient dimension reduction
- Artificial neural network based response surface for data-driven dimensional analysis
- Central subspaces review: methods and applications
- Variable selection based on squared derivative averages
- An Outer-Product-of-Gradient Approach to Dimension Reduction and its Application to Classification in High Dimensional Space
- On forward sufficient dimension reduction for categorical and ordinal responses
- Direct Estimation of the Derivative of Quadratic Mutual Information with Application in Supervised Dimension Reduction
- Sufficient dimension reduction with simultaneous estimation of effective dimensions for time-to-event data
- Dimension Reduction for Gaussian Process Emulation: An Application to the Influence of Bathymetry on Tsunami Heights
- Testing if a nonlinear system is additive or not
- Supervised dimensionality reduction via distance correlation maximization
- Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
- Sequential Learning of Active Subspaces
- Data-Driven Polynomial Ridge Approximation Using Variable Projection
- Identifying outliers using multiple kernel canonical correlation analysis with application to imaging genetics
- Model-based reinforcement learning with dimension reduction
- Linearity identification for general partial linear single-index models
- Gaussian Process-Based Dimension Reduction for Goal-Oriented Sequential Design
- Learning sparse conditional distribution: an efficient kernel-based approach
Cites Work
- Principal support vector machines for linear and nonlinear sufficient dimension reduction
- On regularization algorithms in learning theory
- Kernel methods in machine learning
- An RKHS formulation of the inverse regression dimension-reduction problem
- Optimal rates for the regularized least-squares algorithm
- Kernel dimension reduction in regression
- Asymptotic behavior of the eigenvalues of certain integral equations. II
- Shannon sampling. II: Connections to learning theory
- Contour regression: a general approach to dimension reduction
- Learning theory estimates via integral operators and their approximations
- A Note on Sliced Inverse Regression with Regularizations
- On Directional Regression for Dimension Reduction
- Extending Sliced Inverse Regression
- An Adaptive Estimation of Dimension Reduction Space
- Discussion of: Brownian distance covariance