D-trace estimation of a precision matrix using adaptive lasso penalties
DOI: 10.1007/s11634-016-0272-8 · zbMath: 1414.62224 · OpenAlex: W2516747027 · MaRDI QID: Q2418368
Vahe Avagyan, Francisco J. Nogales, Andrés M. Alonso
Publication date: 3 June 2019
Published in: Advances in Data Analysis and Classification (ADAC)
Full work available at URL: http://hdl.handle.net/10016/32938
MSC Classification
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Analysis of variance and covariance (ANOVA) (62J10)
- Graphical methods in numerical analysis (65S05)
Related Items (4)
- D-Trace estimation of a precision matrix with eigenvalue control
- Precision matrix estimation under data contamination with an application to minimum variance portfolio selection
- Precision matrix estimation using penalized generalized Sylvester matrix equation
- Regularized estimation of precision matrix for high-dimensional multivariate longitudinal data
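The title combines the D-trace loss (introduced in the cited work "Sparse precision matrix estimation via lasso penalized D-trace loss") with adaptive lasso weights (from "The Adaptive Lasso and Its Oracle Properties"). As a rough, hedged sketch of how such an estimator could be computed — not the authors' actual algorithm — the following minimizes the D-trace objective ½·tr(ΘSΘ) − tr(Θ) plus an adaptively weighted ℓ1 penalty via proximal gradient descent (ISTA). The function name, the ridge-type pilot estimate for the weights, and all parameter defaults are illustrative assumptions:

```python
import numpy as np

def dtrace_adaptive_lasso(S, lam=0.1, gamma=0.5, step=None, n_iter=500):
    """Sketch: minimize 0.5*tr(Theta S Theta) - tr(Theta)
    + lam * sum_{i!=j} w_ij |Theta_ij| by proximal gradient (ISTA).
    Adaptive weights w_ij = 1/|Theta0_ij|^gamma come from a pilot
    estimate Theta0 (here: inverse of a ridge-stabilized S, an
    illustrative choice, not the paper's prescription)."""
    p = S.shape[0]
    Theta0 = np.linalg.inv(S + 0.1 * np.eye(p))        # pilot estimate
    W = 1.0 / (np.abs(Theta0) ** gamma + 1e-8)         # adaptive weights
    np.fill_diagonal(W, 0.0)                           # diagonal unpenalized
    if step is None:
        # 1/L with L = largest eigenvalue of S (Lipschitz const of gradient)
        step = 1.0 / np.linalg.eigvalsh(S).max()
    Theta = np.eye(p)
    for _ in range(n_iter):
        # gradient of the smooth D-trace part: 0.5*(S Theta + Theta S) - I
        grad = 0.5 * (S @ Theta + Theta @ S) - np.eye(p)
        Z = Theta - step * grad
        # elementwise soft-thresholding with adaptive weights
        Theta = np.sign(Z) * np.maximum(np.abs(Z) - step * lam * W, 0.0)
    return Theta
```

Note that plain proximal iterations like these do not enforce positive definiteness of the estimate, which is precisely the issue addressed by the related item "D-Trace estimation of a precision matrix with eigenvalue control".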
Cites Work
- Sparse inverse covariance estimation with the graphical lasso
- The Adaptive Lasso and Its Oracle Properties
- A well-conditioned estimator for large-dimensional covariance matrices
- Nonlinear shrinkage estimation of large-dimensional covariance matrices
- Adjusting for high-dimensional covariates in sparse precision matrix estimation by \(\ell_1\)-penalization
- Dominating estimators for minimum-variance portfolios
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Relaxed Lasso
- Sparsistency and rates of convergence in large covariance matrix estimation
- Empirical Bayes estimation of the multivariate normal covariance matrix
- A joint convex penalty for inverse covariance matrix estimation
- Nonparametric Stein-type shrinkage covariance matrix estimators in high-dimensional settings
- Sparse estimation of high-dimensional correlation matrices
- On the distribution of the largest eigenvalue in principal components analysis
- Adaptive covariance matrix estimation through block thresholding
- Sparse permutation invariant covariance estimation
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Bayesian structure learning in graphical models
- Computationally efficient banding of large covariance matrices for ordered data and connections to banding the inverse Cholesky factor
- Network exploration via the adaptive LASSO and SCAD penalties
- Outlier detection and robust covariance estimation using mathematical programming
- Regularized estimation of large covariance matrices
- High-dimensional graphs and variable selection with the Lasso
- A Gaussian graphical model approach to climate networks
- Positive definite estimators of large covariance matrices
- A Constrained ℓ1 Minimization Approach to Sparse Precision Matrix Estimation
- Model selection and estimation in the Gaussian graphical model
- First-Order Methods for Sparse Covariance Selection
- Penalized Normal Likelihood and Ridge Regularization of Correlation and Covariance Matrices
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Positive-Definite ℓ1-Penalized Estimation of Large Covariance Matrices
- Generalized Thresholding of Large Covariance Matrices
- Sparse precision matrix estimation via lasso penalized D-trace loss