Safe Feature Elimination in Sparse Supervised Learning
From MaRDI portal
Publication: 4906144
zbMath: 1259.65010 · arXiv: 1009.3515 · MaRDI QID: Q4906144
No author found.
Publication date: 7 February 2013
Full work available at URL: https://arxiv.org/abs/1009.3515
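The paper referenced above introduces the SAFE screening test, which discards features guaranteed to have zero weight in the Lasso solution before the solver runs. A minimal sketch of that test for the Lasso objective min_w 0.5·||y − Xw||² + λ||w||₁ is given below; the function and variable names are my own, and this is an illustration rather than the paper's reference implementation:

```python
import numpy as np

def safe_screen_lasso(X, y, lam):
    """SAFE test (sketch): boolean mask of features that can be safely
    eliminated (guaranteed zero Lasso coefficient) at regularization lam."""
    corr = np.abs(X.T @ y)                 # |x_j^T y| for each feature j
    lam_max = corr.max()                   # smallest lam giving the all-zero solution
    norms = np.linalg.norm(X, axis=0)      # column norms ||x_j||
    # Feature j is eliminated when |x_j^T y| falls below this threshold.
    thresh = lam - norms * np.linalg.norm(y) * (lam_max - lam) / lam_max
    return corr < thresh                   # True => drop feature j

# Toy usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
y = rng.standard_normal(100)
lam_max = np.abs(X.T @ y).max()
mask = safe_screen_lasso(X, y, 0.9 * lam_max)
print(mask.sum(), "of", X.shape[1], "features eliminated")
```

The test is conservative by design: it only removes features whose optimal weight is provably zero, so the reduced problem has the same solution as the original. It is most effective when λ is close to λ_max, and eliminates nothing when the threshold becomes negative.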
Related Items
Safe Rules for the Identification of Zeros in the Solutions of the SLOPE Problem
Nonsmoothness in machine learning: specific structure, proximal identification, and applications
Accelerated, Parallel, and Proximal Coordinate Descent
Thresholding least-squares inference in high-dimensional regression models
A Scalable Hierarchical Lasso for Gene–Environment Interactions
A Pliable Lasso
Natural coordinate descent algorithm for \(\ell_1\)-penalised regression in generalised linear models
Screening for a reweighted penalized conditional gradient method
Safe feature screening rules for the regularized Huber regression
Adaptive hybrid screening for efficient lasso optimization
Fast stepwise regression based on multidimensional indexes
On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
“FISTA” in Banach spaces with adaptive discretisations
A novel ramp loss-based multi-task twin support vector machine with multi-parameter safe acceleration
Sparse identification of posynomial models
Estimation of semiparametric regression model with right-censored high-dimensional data
Smooth over-parameterized solvers for non-smooth structured optimization
Screening Rules and its Complexity for Active Set Identification
Proximal gradient/semismooth Newton methods for projection onto a polyhedron via the duality-gap-active-set strategy
A safe double screening strategy for elastic net support vector machine
Algorithms for Sparse Support Vector Machines
Scaling up twin support vector regression with safe screening rule
A safe reinforced feature screening strategy for Lasso based on feasible solutions
An active-set proximal-Newton algorithm for \(\ell_1\) regularized optimization problems with box constraints
Concise comparative summaries (CCS) of large text corpora with a human experiment
Solving a class of feature selection problems via fractional 0–1 programming
The sliding Frank–Wolfe algorithm and its application to super-resolution microscopy
Double fused Lasso regularized regression with both matrix and vector valued predictors
Regularization parameter selection for the low rank matrix recovery
Safe Triplet Screening for Distance Metric Learning
Understanding large text corpora via sparse machine learning
Conducting sparse feature selection on arbitrarily long phrases in text corpora with a focus on interpretability
Safe feature elimination for non-negativity constrained convex optimization
A hybrid acceleration strategy for nonparallel support vector machine
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
Distance metric learning for graph structured data
A safe screening rule for accelerating weighted twin support vector machine
Gap Safe screening rules for sparsity enforcing penalties
A decomposition method for Lasso problems with zero-sum constraint
Uses Software