Generic Bounds on the Maximum Deviations in Sequential Prediction: An Information-Theoretic Analysis
Publication: 6327240
arXiv: 1910.06742
MaRDI QID: Q6327240
Author name not available
Publication date: 11 October 2019
Abstract: In this paper, we derive generic bounds on the maximum deviations in prediction errors for sequential prediction via an information-theoretic approach. The fundamental bounds are shown to depend only on the conditional entropy of the data point to be predicted given the previous data points. In the asymptotic case, the bounds are achieved if and only if the prediction error is white and uniformly distributed.
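The abstract states that the bounds are achieved asymptotically if and only if the prediction error is white and uniformly distributed. A minimal numerical sketch of that relationship, assuming a zero-mean uniform error on [-a, a] (the half-width `a` and the exact form of the bound are illustrative assumptions, not taken from the paper): for such an error the differential entropy is h(e) = log2(2a) bits, so the maximum deviation a can be recovered from the entropy as 2^h(e) / 2.

```python
import numpy as np

# Illustrative assumption (not the paper's derivation): prediction error
# e ~ Uniform[-a, a], a white uniform error, which is the case the abstract
# identifies as achieving the bound.
a = 0.75                       # assumed half-width of the uniform error

# Differential entropy of Uniform[-a, a] in bits: h(e) = log2(2a).
h = np.log2(2 * a)

# Maximum deviation recovered from the entropy: 2**h / 2 == a.
max_dev_from_entropy = 2 ** h / 2

# Empirical check: the largest |e| over many samples approaches a.
rng = np.random.default_rng(0)
e = rng.uniform(-a, a, size=1_000_000)
empirical_max_dev = np.abs(e).max()

print(max_dev_from_entropy)    # -> 0.75
print(empirical_max_dev)       # close to 0.75
```

This only illustrates how an entropy quantity pins down a maximum deviation for the uniform case; the paper's actual bounds are stated in terms of the conditional entropy of the next data point given the past.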