Efficient and sparse neural networks by pruning weights in a multiobjective learning approach
From MaRDI portal
Publication:2669736
DOI: 10.1016/j.cor.2021.105676
OpenAlex: W3082139599
Wikidata: Q111522174
Scholia: Q111522174
MaRDI QID: Q2669736
Malena Reiners, Fabian Heldmann, Michael Stiglmayr, Kathrin Klamroth
Publication date: 9 March 2022
Published in: Computers & Operations Research
Full work available at URL: https://arxiv.org/abs/2008.13590
Keywords: automated machine learning; \(l_1\)-regularization; multiobjective learning; stochastic multi-gradient descent; unstructured pruning
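The keywords combine \(l_1\)-regularization with unstructured pruning. A minimal sketch of how these two ingredients interact, using a toy NumPy weight matrix (all names and thresholds here are illustrative assumptions, not the paper's actual setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weight matrix standing in for one dense layer (hypothetical example;
# the paper trains full neural networks in a multiobjective setting).
W = rng.normal(size=(8, 8))

# The l1 penalty lam * sum(|W|) contributes lam * sign(W) to the gradient,
# pushing individual weights toward zero and promoting sparsity.
lam = 0.1
l1_grad = lam * np.sign(W)

# Unstructured pruning: zero out individual weights whose magnitude falls
# below a threshold, regardless of their position in the matrix.
threshold = 0.5
mask = np.abs(W) >= threshold
W_pruned = W * mask

sparsity = 1.0 - mask.mean()
print(f"fraction of weights pruned: {sparsity:.2f}")
```

In the multiobjective view taken by the paper, the training loss and the \(l_1\) sparsity term are treated as separate objectives (handled via stochastic multi-gradient descent) rather than folded into a single weighted sum.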
Related Items (4)
- Adaptive sampling stochastic multigradient algorithm for stochastic multiobjective optimization
- PINN training using biobjective optimization: the trade-off between data loss and residual loss
- Optimal deep neural networks by maximization of the approximation power
- Accuracy and fairness trade-offs in machine learning: a stochastic multi-objective approach
Uses Software
Cites Work
- Stochastic approach versus multiobjective approach for obtaining efficient solutions in stochastic multiobjective programming problems
- Steepest descent methods for multicriteria optimization.
- A stochastic multiple gradient descent algorithm
- Improving generalization of MLPs with multi-objective optimization
- Multi-objective machine learning
- Bicriteria Transportation Problem
- Optimization Methods for Large-Scale Machine Learning
- Multicriteria Optimization
- Asymptotic Distribution of Stochastic Approximation Procedures
- A Stochastic Approximation Method