Sequential safe feature elimination rule for \(L_1\)-regularized regression with Kullback-Leibler divergence
From MaRDI portal
Publication: 6488733
DOI: 10.1016/J.NEUNET.2022.09.008 — MaRDI QID: Q6488733
Hongmei Wang, Yitian Xu, Kun Jiang
Publication date: 18 October 2023
Published in: Neural Networks
Cites Work
- A safe reinforced feature screening strategy for Lasso based on feasible solutions
- Safe feature screening rules for the regularized Huber regression
- Safe Feature Elimination in Sparse Supervised Learning
- Multiplicative Updates for NMF with $\beta$-Divergences under Disjoint Equality Constraints
- This is SPIRAL-TAP: Sparse Poisson Intensity Reconstruction ALgorithms—Theory and Practice
- Strong Rules for Discarding Predictors in Lasso-Type Problems
- A statistical framework for non-negative matrix factorization based on generalized dual divergence