SAVE: a method for dimension reduction and graphics in regression

From MaRDI portal
Publication: 4550653

DOI: 10.1080/03610920008832598
zbMath: 1061.62503
OpenAlex: W2087717467
MaRDI QID: Q4550653

R. Dennis Cook

Publication date: 2000

Published in: Communications in Statistics - Theory and Methods

Full work available at URL: https://doi.org/10.1080/03610920008832598
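The method this entry records, SAVE (sliced average variance estimation), estimates dimension-reduction directions by standardizing the predictors, slicing the response, and averaging the squared deviation of each within-slice covariance matrix from the identity; the leading eigenvectors of that average span the estimated directions. The following is a minimal illustrative sketch of that standard construction, not code from the paper; the function name `save_directions` and its parameters are hypothetical.

```python
import numpy as np

def save_directions(X, y, n_slices=5, n_dirs=2):
    """Sketch of SAVE (sliced average variance estimation).

    Averages (I - within-slice covariance of standardized X)^2
    over slices of y; leading eigenvectors give the estimated
    dimension-reduction directions on the original X scale.
    """
    n, p = X.shape

    # Standardize predictors: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice the response into roughly equal-count slices
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # M = sum_h (n_h / n) * (I - V_h)^2, with V_h = cov(Z) within slice h
    M = np.zeros((p, p))
    for idx in slices:
        Vh = np.cov(Z[idx], rowvar=False)
        D = np.eye(p) - Vh
        M += (len(idx) / n) * (D @ D)

    # Leading eigenvectors of M, mapped back to the X scale
    _, V = np.linalg.eigh(M)
    return Sigma_inv_sqrt @ V[:, ::-1][:, :n_dirs]
```

A point often noted about SAVE is that, unlike sliced inverse regression, it can recover directions with purely symmetric dependence (e.g. a response driven by the square of a linear combination), since such dependence shows up in within-slice variances even when within-slice means are flat.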



Related Items

- Sparse Sliced Inverse Regression via Cholesky Matrix Penalization
- Data-guided Treatment Recommendation with Feature Scores
- A Sliced Inverse Regression Approach for a Stratified Population
- A new sliced inverse regression method for multivariate response
- On kernel method for sliced average variance estimation
- A data-adaptive hybrid method for dimension reduction
- Unnamed Item
- The hybrid method of FSIR and FSAVE for functional effective dimension reduction
- On the optimality of sliced inverse regression in high dimensions
- Sufficient Dimension Reduction via Squared-Loss Mutual Information Estimation
- Variable importance assessment in sliced inverse regression for variable selection
- SAVE: Robust or not?
- Stationary subspace analysis based on second-order statistics
- Conditional regression for single-index models
- Missing data analysis with sufficient dimension reduction
- Sufficient dimension reduction through informative predictor subspace
- Level Set Learning with Pseudoreversible Neural Networks for Nonlinear Dimension Reduction in Function Approximation
- Likelihood-based surrogate dimension reduction
- Optimal quantization applied to sliced inverse regression
- Estimating covariance and precision matrices along subspaces
- Model free estimation of graphical model using gene expression data
- The effect of data contamination in sliced inverse regression and finite sample breakdown point
- Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
- Slice inverse regression with score functions
- Using sliced mean variance-covariance inverse regression for classification and dimension reduction
- Sliced average variance estimation for multivariate time series
- Bagging Versions of Sliced Inverse Regression
- Wavelet methods in statistics: some recent developments and their applications
- Model-based reinforcement learning with dimension reduction
- Asymptotics for sliced average variance estimation
- On splines approximation for sliced average variance estimation
- A graphical tool for selecting the number of slices and the dimension of the model in SIR and SAVE approaches
- Dimension reduction regressions with measurement errors subject to additive distortion
- Dimension Reduction via Gaussian Ridge Functions
- Estimating multi-index models with response-conditional least squares
- Sparse Sliced Inverse Regression Via Lasso
- Machine learning with squared-loss mutual information
- A new approach on recursive and non-recursive SIR methods
- Isometric sliced inverse regression for nonlinear manifold learning
- Inverse regression for ridge recovery: a data-driven approach for parameter reduction in computer experiments
- On hybrid methods of inverse regression-based algorithms
- Bayesian inverse regression for supervised dimension reduction with small datasets
- On the usage of joint diagonalization in multivariate statistics
- Advanced topics in sliced inverse regression
- Fusing sufficient dimension reduction with neural networks
- Sliced Average Variance Estimation for Censored Data
- Dimension reduction based on weighted variance estimate
- Gauss-Christoffel quadrature for inverse regression: applications to computer experiments
- Computational Outlier Detection Methods in Sliced Inverse Regression
- Smoothed average variance estimation for dimension reduction with functional data



Cites Work