Information Plane Analysis for Dropout Neural Networks

From MaRDI portal
Publication: 6428087

arXiv: 2303.00596 · MaRDI QID: Q6428087

Author name not available

Publication date: 1 March 2023

Abstract: The information-theoretic framework promises to explain the predictive power of neural networks. In particular, the information plane analysis, which measures mutual information (MI) between input and representation as well as between representation and output, should give rich insights into the training process. This approach, however, was shown to depend strongly on the choice of MI estimator. The problem is amplified for deterministic networks, where the MI between input and representation is infinite: the estimated values are then determined by the choice of estimator and do not adequately represent the training process from an information-theoretic perspective. In this work, we show that dropout with continuously distributed noise ensures that MI is finite. We demonstrate in a range of experiments that this enables a meaningful information plane analysis for a class of dropout neural networks that is widely used in practice.




Has companion code repository: https://github.com/link-er/ip_dropout
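For intuition, the abstract's central point can be illustrated with a small, hypothetical sketch (not taken from the linked repository): for a deterministic representation, a binned plug-in MI estimate between input and representation keeps growing as the bin count increases (the true MI is infinite), whereas multiplicative Gaussian dropout noise makes the representation a stochastic function of the input, so the estimate stabilizes. The `gaussian_dropout` and `binned_mi` helpers below are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_dropout(h, rate, rng):
    """Multiplicative Gaussian noise (continuously distributed dropout):
    h * eta with eta ~ N(1, rate/(1-rate)). The representation becomes a
    stochastic function of the input, which keeps I(X;T) finite."""
    sigma = np.sqrt(rate / (1.0 - rate))
    return h * rng.normal(1.0, sigma, size=h.shape)

def binned_mi(x, t, bins=30):
    """Plug-in MI estimate (in nats) from a 2-D histogram -- a common,
    if crude, estimator used in information plane analysis."""
    joint, _, _ = np.histogram2d(x, t, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal of x
    pt = p.sum(axis=0, keepdims=True)   # marginal of t
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / (px @ pt)[mask])).sum())

# Toy 1-D "layer": deterministic representation vs. dropout representation.
x = rng.normal(size=5000)
h = np.tanh(2.0 * x)                          # deterministic representation
t = gaussian_dropout(h, rate=0.25, rng=rng)   # stochastic representation

# For the deterministic map, the binned estimate keeps growing with the
# number of bins; with dropout noise it levels off at a finite value.
for bins in (10, 30, 100):
    print(bins, binned_mi(x, h, bins), binned_mi(x, t, bins))
```

This mirrors the abstract's argument in miniature: the estimated MI for the deterministic map is an artifact of the binning resolution, while the noisy representation yields a well-defined, estimator-stable quantity.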







