Estimating covariance and precision matrices along subspaces
DOI: 10.1214/20-EJS1782 · zbMath: 1461.62067 · arXiv: 1909.12218 · MaRDI QID: Q2219236
Publication date: 19 January 2021
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1909.12218
rate of convergence · dimension reduction · finite sample bounds · covariance matrix · ordinary least squares · precision matrix · single-index model
Nonparametric regression and quantile regression (62G08) · Estimation in multivariate analysis (62H12) · Nonparametric estimation (62G05) · Generalized linear models (logistic models) (62J12) · Analysis of variance and covariance (ANOVA) (62J10)
Related Items (1)
Uses Software
Cites Work
- The Generalized Ridge Estimator of the Inverse Covariance Matrix
- Sparse inverse covariance estimation with the graphical lasso
- Estimating sparse precision matrix: optimal rates of convergence and adaptive estimation
- Covariance estimation for distributions with \({2+\varepsilon}\) moments
- Concentration inequalities and moment bounds for sample covariance operators
- Normal approximation and concentration of spectral projectors of sample covariance
- How close is the sample covariance matrix to the actual covariance matrix?
- The optimal perturbation bounds of the Moore-Penrose inverse under the Frobenius norm
- Covariance regularization by thresholding
- Semiparametric least squares (SLS) and weighted SLS estimation of single-index models
- Some refined bounds for the perturbation of the orthogonal projection and the generalized inverse
- Regression analysis under link violation
- A distribution-free theory of nonparametric regression
- Direct estimation of the index coefficient in a single-index model
- Efficient estimation of linear functionals of principal components
- Nonasymptotic upper bounds for the reconstruction error of PCA
- Fast and adaptive sparse precision matrix estimation in high dimensions
- Efficient estimation in single index models through smoothing splines
- High dimensional single index models
- Multiscale geometric methods for data sets. I: Multiscale SVD, noise and curvature.
- Regularized estimation of large covariance matrices
- High-dimensional graphs and variable selection with the Lasso
- Lower bounds on the smallest eigenvalue of a sample covariance matrix.
- The Generalized Lasso With Non-Linear Observations
- A Constrained ℓ1 Minimization Approach to Sparse Precision Matrix Estimation
- Sufficient dimension reduction and prediction in regression
- Quantitative estimates of the convergence of the empirical covariance matrix in log-concave ensembles
- On Directional Regression for Dimension Reduction
- Sliced Inverse Regression for Dimension Reduction
- SAVE: a method for dimension reduction and graphics in regression
- High-dimensional estimation with geometric constraints
- High-Dimensional Probability
- Some identities for Moore–Penrose inverses of matrix products
- A Review on Dimension Reduction
- An overview of the estimation of large covariance and precision matrices
- On strict sub-Gaussianity, optimal proxy variance and symmetry for bounded random variables
- A useful variant of the Davis–Kahan theorem for statisticians
- Estimating surface normals in noisy point cloud data
- Score estimation in the monotone single‐index model
- Covariance matrix selection and estimation via penalised normal likelihood
- Perturbation theory for pseudo-inverses
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
This page was built for publication: Estimating covariance and precision matrices along subspaces