Quantile regression with deep ReLU Networks: Estimators and minimax rates

MaRDI QID: Q6351469
arXiv: 2010.08236

Author name not available

Publication date: 16 October 2020

Abstract: Quantile regression is the task of estimating a specified percentile response, such as the median, from a collection of known covariates. We study quantile regression with rectified linear unit (ReLU) neural networks as the chosen model class. We derive an upper bound on the expected mean squared error of a ReLU network used to estimate any quantile conditional on a set of covariates. This upper bound only depends on the best possible approximation error, the number of layers in the network, and the number of nodes per layer. We further show upper bounds that are tight for two large classes of functions: compositions of Hölder functions and members of a Besov space. These tight bounds imply that ReLU networks with quantile regression achieve minimax rates for broad collections of function types. Unlike existing work, the theoretical results hold under minimal assumptions and apply to general error distributions, including heavy-tailed distributions. Empirical simulations on a suite of synthetic response functions demonstrate that the theoretical results translate to practical implementations of ReLU networks. Overall, the theoretical and empirical results provide insight into the strong performance of ReLU neural networks for quantile regression across a broad range of function classes and error distributions. All code for this paper is publicly available at https://github.com/tansey/quantile-regression.
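For a concrete picture of the setup the abstract describes, the sketch below fits a small fully connected ReLU network by minimizing the check (pinball) loss at quantile level tau on synthetic data with heavy-tailed noise. This is an illustrative PyTorch sketch under assumed settings (network size, data, optimizer are arbitrary choices for the example); it is not taken from the companion repository linked below.

```python
import torch
import torch.nn as nn

def pinball_loss(pred, target, tau):
    # Check (pinball) loss at level tau:
    # rho_tau(u) = max(tau * u, (tau - 1) * u), with u = target - pred.
    u = target - pred
    return torch.mean(torch.maximum(tau * u, (tau - 1) * u))

# A small fully connected ReLU network; depth (number of layers) and
# width (nodes per layer) are the quantities the paper's bounds depend on.
net = nn.Sequential(
    nn.Linear(1, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)

# Synthetic data: y = sin(3x) + heavy-tailed (Cauchy) noise. The noise
# has median zero, so the true conditional median of y given x is sin(3x).
torch.manual_seed(0)
x = 2 * torch.rand(512, 1) - 1
y = torch.sin(3 * x) + 0.1 * torch.distributions.Cauchy(0.0, 1.0).sample((512, 1))

tau = 0.5  # 0.5 targets the conditional median; any level in (0, 1) works
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = pinball_loss(net(x), y, tau)
    loss.backward()
    opt.step()
```

Setting tau to, e.g., 0.9 estimates the conditional 90th percentile with the same loss; the heavy-tailed noise illustrates the abstract's point that the check loss, unlike squared error, remains well behaved under general error distributions.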

Has companion code repository: https://github.com/tansey/quantile-regression
