Semialgebraic Optimization for Lipschitz Constants of ReLU Networks

From MaRDI portal
Publication: Q6334469

arXiv: 2002.03657 · MaRDI QID: Q6334469

Author name not available

Publication date: 10 February 2020

Abstract: The Lipschitz constant of a network plays an important role in many applications of deep learning, such as robustness certification and Wasserstein Generative Adversarial Networks. We introduce a semidefinite programming hierarchy to estimate the global and local Lipschitz constants of a multilayer deep neural network. The novelty is to combine a polynomial lifting for the derivatives of ReLU functions with a weak generalization of Putinar's positivity certificate. This idea could also apply to other nearly sparse polynomial optimization problems in machine learning. We empirically demonstrate that our method offers a trade-off with respect to the state-of-the-art linear programming approach, and in some cases we obtain better bounds in less time.
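To make the quantity being estimated concrete, here is a minimal numpy sketch (not the paper's SDP method): for a ReLU network, the product of the weight matrices' spectral norms is a well-known naive upper bound on the Lipschitz constant, since ReLU is 1-Lipschitz; sampled difference quotients give an empirical lower estimate. The tighter certificates from the paper's semidefinite hierarchy would fall between these two values. The network and weights below are purely illustrative assumptions.

```python
import numpy as np

# Hypothetical two-layer ReLU network x -> W2 relu(W1 x);
# weights are random illustrations, not from the paper's experiments.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))

def net(x):
    return W2 @ np.maximum(W1 @ x, 0.0)

# Naive global upper bound: ReLU is 1-Lipschitz, so the product of
# spectral norms ||W2||_2 * ||W1||_2 bounds the Lipschitz constant.
naive_bound = np.linalg.norm(W2, 2) * np.linalg.norm(W1, 2)

# Empirical lower estimate: largest sampled difference quotient
# ||f(x) - f(y)|| / ||x - y|| over random input pairs.
pairs = rng.standard_normal((1000, 2, 3))
empirical = max(np.linalg.norm(net(x) - net(y)) / np.linalg.norm(x - y)
                for x, y in pairs)

# The true Lipschitz constant lies in [empirical, naive_bound];
# SDP-based certificates aim to tighten the upper end.
assert empirical <= naive_bound + 1e-9
```

Tighter methods matter because the spectral-norm product can be very loose for deep networks, while exact computation is NP-hard; the hierarchy in the paper trades solver cost for bound quality.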




Has companion code repository: https://github.com/TongCHEN779/CertDNN








