An ETF view of Dropout regularization
arXiv: 1810.06049
MaRDI QID: Q6308207
Author name not available
Publication date: 14 October 2018
Abstract: Dropout is a popular regularization technique in deep learning, yet the reason for its success is still not fully understood. This paper provides a new interpretation of dropout from a frame-theory perspective. By drawing a connection to recent developments in analog channel coding, we suggest that for a certain family of autoencoders with a linear encoder, optimizing the encoder with dropout regularization leads to an equiangular tight frame (ETF). Since this optimization is non-convex, we add a further regularizer that promotes such structures by minimizing the cross-correlation between filters in the network. We demonstrate its applicability to convolutional and fully connected layers in both feed-forward and recurrent networks. These results suggest that there is indeed a relationship between dropout and the ETF structure of the regularized linear operations.
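The cross-correlation regularizer mentioned in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation (see the companion repository below for that): it assumes the penalty is the sum of squared off-diagonal entries of the Gram matrix of the unit-normalized filters, a quantity that equals the frame potential up to a constant and whose minimizers over unit-norm vectors are tight frames. The function name etf_regularizer and the weight lam used afterwards are hypothetical.

```python
import torch

def etf_regularizer(weight: torch.Tensor) -> torch.Tensor:
    """Penalize cross-correlation between filters (rows of `weight`).

    Each filter is normalized to unit norm, so the Gram matrix holds the
    pairwise correlations. Summing the squared off-diagonal entries gives
    the frame potential up to a constant; its minimizers over unit-norm
    vectors are tight frames, so driving it down promotes ETF-like,
    low-coherence filter banks.
    """
    w = weight.flatten(1)                                  # (num_filters, dim)
    w = w / w.norm(dim=1, keepdim=True).clamp_min(1e-12)   # unit-norm filters
    gram = w @ w.t()                                       # pairwise correlations
    off_diag = gram - torch.eye(w.shape[0], device=w.device, dtype=w.dtype)
    return off_diag.pow(2).sum()
```

In training, such a penalty would simply be added to the task objective, e.g. loss = reconstruction_loss + lam * etf_regularizer(encoder.weight), with lam a hyperparameter balancing the task loss against the ETF-promoting term.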
Companion code repository: https://github.com/dorbank/An-ETF-view-of-Dropout-Regularization