Simultaneous feature selection and clustering based on square root optimization
From MaRDI portal
Publication:2028812
DOI: 10.1016/j.ejor.2020.06.045 · zbMath: 1487.62086 · OpenAlex: W3039543396 · MaRDI QID: Q2028812
Yao Dong, Shi Hua Luo, He Jiang
Publication date: 3 June 2021
Published in: European Journal of Operational Research
Full work available at URL: https://doi.org/10.1016/j.ejor.2020.06.045
Keywords: clustering; alternating direction method of multipliers; feature selection; analytics; square root fused Lasso
Nonparametric regression and quantile regression (62G08) ⋮ Ridge regression; shrinkage estimators (Lasso) (62J07) ⋮ Applications of mathematical programming (90C90)
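The keywords above name the square-root fused Lasso. As a reference point only (the notation here is assumed, not quoted from the paper), one common form of that criterion combines a square-root loss with lasso and fusion penalties:

```latex
\min_{\beta \in \mathbb{R}^p} \;
  \frac{\lVert y - X\beta \rVert_2}{\sqrt{n}}
  + \lambda_1 \sum_{j=1}^{p} \lvert \beta_j \rvert
  + \lambda_2 \sum_{j=2}^{p} \lvert \beta_j - \beta_{j-1} \rvert
```

Taking the square root of the usual least-squares loss makes the choice of the penalty level pivotal with respect to the unknown noise scale, the property emphasized in the square-root lasso work cited below.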
Related Items (3)
Preference estimation under bounded rationality: identification of attribute non-attendance in stated-choice data using a support vector machines approach ⋮ Sparse and robust estimation with ridge minimax concave penalty ⋮ Dendrograms, minimum spanning trees and feature selection
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- The Adaptive Lasso and Its Oracle Properties
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A stepwise regression method and consistent model selection for high-dimensional sparse linear models
- Feature selection for support vector machines using generalized Benders decomposition
- Oracle inequalities and optimal inference under group sparsity
- Properties and refinements of the fused Lasso
- Estimating the dimension of a model
- On the approximability of minimizing nonzero variables or unsatisfied relations in linear systems
- Sparse spatial autoregressions
- High dimensional data classification and feature selection using support vector machines
- Least angle regression. (With discussion)
- The risk inflation criterion for multiple regression
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Pathwise coordinate optimization
- High-dimensional graphs and variable selection with the Lasso
- Forward Regression for Ultra-High Dimensional Variable Screening
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Extended Bayesian information criteria for model selection with large model spaces
- Spatial smoothing and hot spot detection for CGH data using the fused lasso
- Bayes Model Averaging with Selection of Regressors
- Sparsity and Smoothness Via the Fused Lasso
- Statistics for high-dimensional data
- Regularization and Variable Selection Via the Elastic Net
- The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms
- Model Selection and Estimation in Regression with Grouped Variables
- Some Comments on Cp
- A new look at the statistical model identification
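Several works in the list above (the ADMM survey, FISTA, pathwise coordinate optimization) build on the soft-thresholding proximal operator of the ℓ1 penalty. A minimal, self-contained sketch of that operator (illustrative only, not code from the paper):

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1:
    shrink each entry of x toward zero by t, clipping at zero."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# Entries with magnitude at most t are set exactly to zero,
# which is what produces sparsity in lasso-type updates.
shrunk = soft_threshold(np.array([3.0, -0.5, 1.2]), 1.0)
```

In ADMM or FISTA iterations for (square-root) lasso problems, this operator is applied once per iteration to the current coefficient estimate.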