ReduNet: A White-box Deep Network from the Principle of Maximizing Rate Reduction

From MaRDI portal
Publication:6368234

arXiv: 2105.10446
MaRDI QID: Q6368234

Author name not available

Publication date: 21 May 2021

Abstract: This work attempts to provide a plausible theoretical framework that aims to interpret modern deep (convolutional) networks from the principles of data compression and discriminative representation. We argue that for high-dimensional multi-class data, the optimal linear discriminative representation maximizes the coding rate difference between the whole dataset and the average of all the subsets. We show that the basic iterative gradient ascent scheme for optimizing the rate reduction objective naturally leads to a multi-layer deep network, named ReduNet, which shares common characteristics of modern deep networks. The deep layered architectures, linear and nonlinear operators, and even parameters of the network are all explicitly constructed layer by layer via forward propagation, although they are amenable to fine-tuning via back propagation. All components of the so-obtained "white-box" network have precise optimization, statistical, and geometric interpretations. Moreover, all linear operators of the so-derived network naturally become multi-channel convolutions when we enforce classification to be rigorously shift-invariant. The derivation in the invariant setting suggests a trade-off between sparsity and invariance, and also indicates that such a deep convolutional network is significantly more efficient to construct and learn in the spectral domain. Our preliminary simulations and experiments clearly verify the effectiveness of both the rate reduction objective and the associated ReduNet. All code and data are available at https://github.com/Ma-Lab-Berkeley.
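The rate reduction objective described in the abstract (coding rate of the whole dataset minus the class-size-weighted average coding rate of its class subsets) can be sketched in a few lines of NumPy. The following is a minimal illustration, not the authors' implementation: the function names, the choice of the distortion parameter eps, and the normalization step in the usage example are assumptions made for this sketch; the exact objective and the layer-by-layer construction of ReduNet via gradient ascent on it are given in the paper.

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    # R(Z, eps) = 1/2 * logdet(I + d / (n * eps^2) * Z @ Z.T), with Z of shape (d, n).
    d, n = Z.shape
    alpha = d / (n * eps ** 2)
    return 0.5 * np.linalg.slogdet(np.eye(d) + alpha * Z @ Z.T)[1]

def rate_reduction(Z, labels, eps=0.5):
    # Delta R = R(Z) - sum_j (n_j / n) * R(Z_j): rate of the whole dataset
    # minus the average rate of the class subsets (illustrative sketch).
    d, n = Z.shape
    total = coding_rate(Z, eps)
    average = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]
        average += (Zc.shape[1] / n) * coding_rate(Zc, eps)
    return total - average

# Toy usage (hypothetical data): 3 classes of random 8-dimensional features.
rng = np.random.default_rng(0)
labels = np.repeat([0, 1, 2], 50)
Z = rng.standard_normal((8, 150))
Z /= np.linalg.norm(Z, axis=0)  # features are assumed normalized to the unit sphere
print(rate_reduction(Z, labels))
```

In the paper, each ReduNet layer corresponds to one gradient ascent step on this objective, which is why the layer operators admit explicit constructions rather than being learned by back propagation.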




Has companion code repository: https://github.com/Ma-Lab-Berkeley/MCR2








