An Information-Theoretic Justification for Model Pruning

Publication: Q6360794

arXiv: 2102.08329 · MaRDI QID: Q6360794

Berivan Isik, Tsachy Weissman, Albert No

Publication date: 16 February 2021

Abstract: We study the neural network (NN) compression problem, viewing the tension between the compression ratio and NN performance through the lens of rate-distortion theory. We choose a distortion metric that reflects the effect of NN compression on the model output and derive the tradeoff between rate (compression) and distortion. In addition to characterizing theoretical limits of NN compression, this formulation shows that pruning, implicitly or explicitly, must be a part of a good compression algorithm. This observation bridges a gap between the parts of the literature pertaining to NN and data compression, respectively, providing insight into the empirical success of model pruning. Finally, we propose a novel pruning strategy derived from our information-theoretic formulation and show that it outperforms the relevant baselines on the CIFAR-10 and ImageNet datasets.
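The rate-distortion framing referenced in the abstract follows the standard information-theoretic setup. As a sketch, with the network weights W treated as the source, the compressed weights \hat{W} as the reconstruction, and d a distortion measuring the change in model output (the paper's specific metric is not reproduced here), the rate-distortion function is

R(D) = \min_{P_{\hat{W} \mid W} \,:\, \mathbb{E}[d(W, \hat{W})] \le D} I(W; \hat{W}),

i.e. the minimum number of bits (per weight) needed so that the expected distortion stays below D.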

Companion code repository: https://github.com/BerivanIsik/SuRP
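For readers unfamiliar with pruning, the sketch below shows a generic magnitude-pruning baseline in NumPy. It is illustrative only: a minimal sketch of what "pruning" means in this context, not the SuRP algorithm from the repository above.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude fraction of weights.

    Generic magnitude-pruning baseline, NOT the paper's SuRP method;
    shown only to illustrate the kind of operation being analyzed.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

# Example: prune 90% of a random weight matrix.
w = np.random.randn(256, 256)
w_pruned = magnitude_prune(w, sparsity=0.9)
print(f"sparsity achieved: {np.mean(w_pruned == 0):.2f}")
```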