Neural Architecture Search via Bregman Iterations

From MaRDI portal

arXiv: 2106.02479 · MaRDI QID: Q6369451

Author name not available.

Publication date: 4 June 2021

Abstract: We propose a novel strategy for Neural Architecture Search (NAS) based on Bregman iterations. Starting from a sparse neural network, our gradient-based one-shot algorithm gradually adds relevant parameters in an inverse scale space manner. This allows the network to choose the best architecture in the search space, making it well suited to a given task, e.g., by adding neurons or skip connections. We demonstrate that our approach can unveil, for instance, residual autoencoders for denoising, deblurring, and classification tasks. Code is available at https://github.com/TimRoith/BregmanLearning.
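The inverse scale space behaviour described in the abstract can be illustrated with a standard linearized Bregman iteration on a toy sparse regression problem: parameters start at zero and are activated only once accumulated gradient information crosses a threshold. This is a minimal sketch of the general technique using NumPy, not the authors' implementation (see the linked repository for that); the function names, the quadratic loss, and the step-size choice are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, lam):
    # Proximal map of lam * ||.||_1: shrinks each entry toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def linearized_bregman(A, b, lam=0.5, tau=None, iters=2000):
    """Linearized Bregman iteration for the least-squares loss
    0.5 * ||A x - b||^2 with an L1 sparsity bias.

    The primal iterate x starts fully sparse (all zeros); an entry x_i
    only becomes nonzero once the accumulated subgradient variable v_i
    exceeds the threshold lam, so relevant parameters are added
    gradually -- the inverse scale space behaviour the abstract refers to.
    """
    m, n = A.shape
    if tau is None:
        # Step size from the Lipschitz constant of the gradient.
        tau = 1.0 / np.linalg.norm(A, 2) ** 2
    v = np.zeros(n)          # accumulated (sub)gradient variable
    x = np.zeros(n)          # primal iterate: starts sparse
    support_history = []     # number of active parameters per iteration
    for _ in range(iters):
        grad = A.T @ (A @ x - b)     # gradient of the data-fit term
        v -= tau * grad              # accumulate gradient information in v
        x = soft_threshold(v, lam)   # activate entries with |v| > lam
        support_history.append(int(np.count_nonzero(x)))
    return x, support_history
```

On a consistent system with a sparse ground truth, the iteration recovers the sparse solution while only ever activating a few coordinates, mirroring how the NAS algorithm starts from a sparse network and gradually adds relevant parameters.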




Has companion code repository: https://github.com/TimRoith/BregmanLearning

