Partition-based formulations for mixed-integer optimization of trained ReLU neural networks
Publication: 6360139
arXiv: 2102.04373
MaRDI QID: Q6360139
Author name not available
Publication date: 8 February 2021
Abstract: This paper introduces a class of mixed-integer formulations for trained ReLU neural networks. The approach balances model size and tightness by partitioning node inputs into a number of groups and forming the convex hull over the partitions via disjunctive programming. At one extreme, one partition per input recovers the convex hull of a node, i.e., the tightest possible formulation for each node. For fewer partitions, we develop smaller relaxations that approximate the convex hull, and show that they outperform existing formulations. Specifically, we propose strategies for partitioning variables based on theoretical motivations and validate these strategies using extensive computational experiments. Furthermore, the proposed scheme complements known algorithmic approaches, e.g., optimization-based bound tightening captures dependencies within a partition.
Has companion code repository: https://github.com/cog-imperial/partitionedformulations_nn
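To make the abstract's idea concrete: for a ReLU node y = max(0, w·x + b) with box bounds on x, the partition-based formulation needs bounds on each group's partial pre-activation sum. The sketch below (a hypothetical helper, not code from the companion repository) computes those per-partition bounds by interval arithmetic, assuming one partition per group of input indices; the paper's full formulation builds disjunctive constraints on top of such bounds.

```python
def partition_bounds(w, lb, ub, partitions):
    """Interval bounds for each partition's partial sum sum_{i in S_k} w_i * x_i.

    Illustrative sketch only (names and structure are assumptions, not the
    authors' implementation). Given weights w and box bounds lb <= x <= ub,
    each group's lower bound takes lb[i] where w[i] >= 0 and ub[i] otherwise;
    the upper bound is symmetric.
    """
    bounds = []
    for group in partitions:
        lo = sum(w[i] * (lb[i] if w[i] >= 0 else ub[i]) for i in group)
        hi = sum(w[i] * (ub[i] if w[i] >= 0 else lb[i]) for i in group)
        bounds.append((lo, hi))
    return bounds


# Example: 3 inputs in [0, 1], split into two partitions.
print(partition_bounds([1.0, -2.0, 3.0],
                       [0.0, 0.0, 0.0],
                       [1.0, 1.0, 1.0],
                       [[0, 1], [2]]))  # [(-2.0, 1.0), (0.0, 3.0)]
```

With one partition per input this recovers the per-variable bounds underlying the convex-hull (tightest) case the abstract mentions; coarser partitions trade tightness for fewer auxiliary variables.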