D4R: doubly robust reduced rank regression in high dimension
Publication: 6556782
DOI: 10.1016/j.jspi.2024.106162
MaRDI QID: Q6556782
Xiao-Yan Ma, Lili Wei, Wanfeng Liang
Publication date: 17 June 2024
Published in: Journal of Statistical Planning and Inference
Cites Work
- A penalized matrix decomposition, with applications to sparse principal components and canonical correlation analysis
- Reduced rank regression via adaptive nuclear norm penalization
- Nearly unbiased variable selection under minimax concave penalty
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Robust reduced-rank modeling via rank regression
- Statistics for high-dimensional data. Methods, theory and applications.
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- High breakdown-point and high efficiency robust estimates for regression
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- Regularized multivariate regression for identifying master predictors with application to integrative genomics study of breast cancer
- Reduced-rank regression for the multivariate linear model
- Multivariate reduced-rank regression
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Robust reduced rank regression in a distributed setting
- Dependence in elliptical partial correlation graphs
- Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- High-Dimensional Statistics
- Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection
- Reduced rank ridge regression and its kernel extensions
- Dimension Reduction and Coefficient Estimation in Multivariate Linear Regression
- Regularized Matrix Regression
- Estimation of High Dimensional Mean Regression in the Absence of Symmetry and Light Tail Assumptions
- Selective factor extraction in high dimensions
- Robust reduced-rank regression
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Reduced Rank Stochastic Regression with a Sparse Singular Value Decomposition
- Estimation of Low Rank High-Dimensional Multivariate Linear Models for Multi-Response Data
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Robust Lasso Regression Using Tukey's Biweight Criterion