Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression
From MaRDI portal
Publication: 6086172
DOI: 10.5705/ss.202021.0140
Wikidata: Q114013796
Scholia: Q114013796
MaRDI QID: Q6086172
Wenqi Lu, Zhong-yi Zhu, Heng Lian
Publication date: 9 November 2023
Published in: Statistica Sinica
Cites Work
- A lasso for hierarchical interactions
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Von Neumann entropy penalization and low-rank matrix estimation
- Statistics for high-dimensional data. Methods, theory and applications.
- Estimation of high-dimensional low-rank matrices
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Incremental proximal methods for large scale convex optimization
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Globally adaptive quantile regression with ultra-high dimensional data
- Convex multi-task feature learning
- The composite absolute penalties family for grouped and hierarchical variable selection
- Conditional quantile processes based on series or many regressors
- Innovated interaction screening for high-dimensional nonlinear classification
- Quantile processes for semi and nonparametric regression
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- High-dimensional graphs and variable selection with the Lasso
- Multiplier bootstrap for quantile regression: non-asymptotic theory under random design
- Regression Quantiles
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Model Selection for High-Dimensional Quadratic Regression via Regularization
- Interaction Screening for Ultrahigh-Dimensional Data
- Adaptive Estimation in Two-way Sparse Reduced-rank Regression
- Dimension Reduction and Coefficient Estimation in Multivariate Linear Regression
- Regularized Matrix Regression
- Variable Selection With the Strong Heredity Constraint and Its Oracle Property
- Regularization and Variable Selection Via the Elastic Net
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Reduced Rank Stochastic Regression with a Sparse Singular Value Decomposition
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers