scientific article; zbMATH DE number 7750672
From MaRDI portal
Publication: 6073211
DOI: 10.11329/jjssj.53.69
MaRDI QID: Q6073211
Publication date: 17 October 2023
Title: zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Predictive learning via rule ensembles
- Missing values: sparse inverse covariance estimation and an extension to sparse regression
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Nearly unbiased variable selection under minimax concave penalty
- Sparse inverse covariance estimation with the graphical lasso
- The Adaptive Lasso and Its Oracle Properties
- Best subset selection via a modern optimization lens
- Statistics for high-dimensional data. Methods, theory and applications.
- The solution path of the generalized lasso
- CoCoLasso for high-dimensional error-in-variables regression
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Relaxed Lasso
- High-dimensional additive modeling
- High-dimensional semiparametric Gaussian copula graphical models
- Sparse regression with exact clustering
- Interaction pursuit in high-dimensional multi-response regression via distance correlation
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- Fast rates by transferring from auxiliary hypotheses
- Pathwise coordinate optimization
- A Constrained ℓ1 Minimization Approach to Sparse Precision Matrix Estimation
- Model selection and estimation in the Gaussian graphical model
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- High-Dimensional Statistics
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Sparse Additive Models
- Sparse Partial Least Squares Regression for Simultaneous Dimension Reduction and Variable Selection
- Stability Selection
- Regression Shrinkage and Selection via The Lasso: A Retrospective
- Statistical Analysis with Missing Data, Third Edition
- Gap Safe screening rules for sparsity enforcing penalties
- Feature Screening via Distance Correlation Learning
- Sparsity and Smoothness Via the Fused Lasso
- Safe Feature Elimination in Sparse Supervised Learning
- Independently Interpretable Lasso for Generalized Linear Models
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Regularization and Variable Selection Via the Elastic Net
- High-Dimensional Feature Selection by Feature-Wise Kernelized Lasso
- Model Selection and Estimation in Regression with Grouped Variables
- Simultaneous Regression Shrinkage, Variable Selection, and Supervised Clustering of Predictors with OSCAR
- Strong Rules for Discarding Predictors in Lasso-Type Problems
- Missing-Data Methods for Generalized Linear Models