Lasso, fractional norm and structured sparse estimation using a Hadamard product parametrization
From MaRDI portal
Publication: 1658387
DOI: 10.1016/j.csda.2017.06.007
zbMath: 1466.62098
arXiv: 1611.00040
OpenAlex: W2546982352
Wikidata: Q114671397 (Scholia: Q114671397)
MaRDI QID: Q1658387
Publication date: 14 August 2018
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://arxiv.org/abs/1611.00040
Keywords: optimization; ridge regression; linear regression; spatial autocorrelation; sparsity; generalized linear model; cyclic coordinate descent
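The Hadamard product parametrization named in the title can be sketched briefly: writing the coefficient vector as an elementwise product beta = u ∘ v replaces the non-smooth ℓ1 penalty with smooth ridge penalties on u and v, since λ‖β‖₁ ≤ (λ/2)(‖u‖² + ‖v‖²) with equality when |u_j| = |v_j|. A minimal Python sketch under assumed settings (the simulated data, λ, step size, and iteration count are illustrative choices, not taken from the paper, and plain alternating gradient steps stand in for the paper's coordinate descent):

```python
import numpy as np

# Sketch: lasso via the Hadamard product parametrization beta = u * v.
# The smooth surrogate objective is
#   0.5 * ||y - X(u*v)||^2 + (lam/2) * (||u||^2 + ||v||^2),
# whose minimizers satisfy |u_j| = |v_j|, recovering an l1-type penalty on beta.
rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]              # three active coefficients
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam, step = 5.0, 1e-3                          # illustrative tuning, not from the paper
u = np.full(p, 0.1)                            # small nonzero init so coordinates can grow
v = np.full(p, 0.1)
for _ in range(20000):
    g = X.T @ (X @ (u * v) - y)                # gradient of the loss w.r.t. beta
    u = u - step * (g * v + lam * u)           # chain rule: d beta / d u = v
    g = X.T @ (X @ (u * v) - y)                # recompute after the u step (alternating sweep)
    v = v - step * (g * u + lam * v)

beta_hat = u * v                               # sparse lasso-type estimate
print("nonzeros:", int(np.sum(np.abs(beta_hat) > 0.5)))
```

The alternating (rather than simultaneous) updates matter: they break the u = v symmetry of the initialization, which is what lets a coordinate settle with u_j = −v_j and hence a negative fitted coefficient.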
Related Items (3)
Smooth over-parameterized solvers for non-smooth structured optimization ⋮ Neuronized Priors for Bayesian Sparse Linear Regression ⋮ Understanding Implicit Regularization in Over-Parameterized Single Index Model
Uses Software
Cites Work
- Linear and nonlinear programming
- One-step sparse estimates in nonconcave penalized likelihood models
- The Lasso problem and uniqueness
- Hadamard products and multivariate statistical analysis
- Variable selection using MM algorithms
- Inference with normal-gamma prior distributions in regression problems
- The Bayesian Lasso
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparsity and Smoothness Via the Fused Lasso
- Model Selection and Estimation in Regression with Grouped Variables
- On a Factorisation of Positive Definite Matrices