Sequential Scaled Sparse Factor Regression
Publication:6620886
DOI: 10.1080/07350015.2020.1844212 · zbMATH Open: 1547.62987 · MaRDI QID: Q6620886
Yang Li, Zemin Zheng, Yu-Chen Wang, Jie Wu
Publication date: 17 October 2024
Published in: Journal of Business and Economic Statistics
Cites Work
- Title not available
- Title not available
- Title not available
- Reduced rank regression via adaptive nuclear norm penalization
- A unified approach to model selection and sparse recovery using regularized least squares
- The Adaptive Lasso and Its Oracle Properties
- Sparse principal component analysis and iterative thresholding
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Reduced-rank regression for the multivariate linear model
- Low rank multivariate regression
- Asymptotics of empirical eigenstructure for high dimensional spiked covariance
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Network exploration via the adaptive LASSO and SCAD penalties
- Generalized high-dimensional trace regression via nuclear norm regularization
- Simultaneous analysis of Lasso and Dantzig selector
- Bayesian sparse reduced rank multivariate regression
- Adaptive robust variable selection
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Rate minimaxity of the Lasso and Dantzig selector for the \(l_{q}\) loss in \(l_{r}\) balls
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Decoding by Linear Programming
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Regularization after retention in ultrahigh dimensional linear regression models
- Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection
- Dimension Reduction and Coefficient Estimation in Multivariate Linear Regression
- SOFAR: Large-Scale Association Network Learning
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Asymptotic properties for combined \(L_1\) and concave regularization
- Reduced Rank Stochastic Regression with a Sparse Singular Value Decomposition
- Tuning Parameter Selection in High Dimensional Penalized Likelihood
- Estimating Linear Restrictions on Regression Coefficients for Multivariate Normal Distributions
- Projected principal component analysis in factor models