Learning with Differentiable Perturbed Optimizers
Publication: 6335205
arXiv: 2002.08676 · MaRDI QID: Q6335205
Author name not available
Publication date: 20 February 2020
Abstract: Machine learning pipelines often rely on optimization procedures to make discrete decisions (e.g., sorting, picking closest neighbors, or shortest paths). Although these discrete decisions are easily computed, they break the back-propagation of computational graphs. In order to expand the scope of learning problems that can be solved in an end-to-end fashion, we propose a systematic method to transform optimizers into operations that are differentiable and never locally constant. Our approach relies on stochastically perturbed optimizers, and can be used readily together with existing solvers. Their derivatives can be evaluated efficiently, and smoothness tuned via the chosen noise amplitude. We also show how this framework can be connected to a family of losses developed in structured prediction, and give theoretical guarantees for their use in learning tasks. We demonstrate experimentally the performance of our approach on various tasks.
Has companion code repository: https://github.com/tuero/perturbations-differential-pytorch
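The core idea of the abstract, replacing a discrete argmax with the expectation of the argmax under random perturbations of its inputs so that the resulting map becomes smooth and differentiable, can be illustrated with a short sketch. The snippet below is not the companion repository's API; it is a minimal, hypothetical PyTorch illustration assuming Gaussian noise, a one-hot argmax solver, and Monte Carlo estimation of both the perturbed output and its vector-Jacobian product.

```python
import torch

class PerturbedArgmax(torch.autograd.Function):
    """Sketch of a perturbed argmax y_eps(theta) = E_Z[argmax(theta + sigma * Z)],
    estimated by Monte Carlo with Gaussian noise Z (illustration only)."""

    @staticmethod
    def forward(ctx, theta, sigma, n_samples):
        # theta: (batch, dim) scores; draw n_samples Gaussian perturbations of it.
        noise = torch.randn(n_samples, *theta.shape,
                            dtype=theta.dtype, device=theta.device)
        perturbed = theta.unsqueeze(0) + sigma * noise            # (n, batch, dim)
        onehot = torch.nn.functional.one_hot(
            perturbed.argmax(dim=-1), num_classes=theta.shape[-1]
        ).to(theta.dtype)                                         # discrete solutions
        ctx.save_for_backward(onehot, noise)
        ctx.sigma = sigma
        return onehot.mean(dim=0)                                 # smoothed expectation

    @staticmethod
    def backward(ctx, grad_output):
        onehot, noise = ctx.saved_tensors
        # For Gaussian noise, a Monte Carlo estimate of the Jacobian is
        # E[argmax(theta + sigma * Z) Z^T] / sigma; contract it with grad_output.
        inner = (grad_output.unsqueeze(0) * onehot).sum(-1, keepdim=True)  # (n, batch, 1)
        grad_theta = (inner * noise).mean(dim=0) / ctx.sigma               # (batch, dim)
        return grad_theta, None, None

# Hypothetical usage: a differentiable "pick the best index" layer.
theta = torch.tensor([[1.0, 2.0, 1.5]], requires_grad=True)
y = PerturbedArgmax.apply(theta, 0.5, 10_000)   # smooth, never locally constant
y[0, 1].backward()                              # gradients flow back to theta
print(y, theta.grad)
```

In this sketch the noise amplitude sigma plays the role described in the abstract: larger values give a smoother, more uniform output, while smaller values approach the original piecewise-constant argmax.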