An information upper bound for probability sensitivity

MaRDI QID: Q6401199

arXiv: 2206.02274

Author: Jiannan Yang

Publication date: 5 June 2022

Abstract: Uncertain input of a mathematical model induces uncertainties in the output, and probabilistic sensitivity analysis identifies the influential inputs to guide decision-making. Of practical concern is the probability that the output would, or would not, exceed a threshold, and the probability sensitivity depends on this threshold, which is often uncertain. The Fisher information and the Kullback-Leibler divergence have recently been proposed in the literature as threshold-independent sensitivity metrics. We present a mathematical proof that these information-theoretic metrics provide an upper bound for the probability sensitivity. The proof is elementary, relying only on a special version of the Cauchy-Schwarz inequality called Titu's lemma. Although various inequalities exist for probabilities, little is known about probability sensitivity bounds, and the one proposed here is, to the authors' knowledge, new. The probability sensitivity bound is extended, analytically and with numerical examples, to the Fisher information of both the input and output. It thus provides a solid mathematical basis for decision-making based on probabilistic sensitivity metrics.
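The proof hinges on Titu's lemma, the Engel form of the Cauchy-Schwarz inequality: for real a_i and positive b_i, sum_i a_i^2/b_i >= (sum_i a_i)^2 / (sum_i b_i). For orientation only, here is a minimal sketch of how a bound of this type can be obtained; the notation (p(x; theta) for the output density, P(theta) for the exceedance probability, F(theta) for the Fisher information) and the exact form of the bound are assumptions of this sketch, not statements taken from the paper.

% Hypothetical sketch, not the paper's statement: split the Fisher
% information over the exceedance event A = {x : y(x) > y_0} and its
% complement, then apply the integral form of Titu's lemma to each part,
% using \int_{A^c} \partial_\theta p \, dx = \partial_\theta (1 - P) = -\partial_\theta P.
\[
  F(\theta)
  = \int \frac{\bigl(\partial_\theta p(x;\theta)\bigr)^2}{p(x;\theta)}\,\mathrm{d}x
  \;\ge\;
  \frac{\bigl(\int_A \partial_\theta p \,\mathrm{d}x\bigr)^2}{\int_A p \,\mathrm{d}x}
  + \frac{\bigl(\int_{A^c} \partial_\theta p \,\mathrm{d}x\bigr)^2}{\int_{A^c} p \,\mathrm{d}x}
  = \frac{(\partial_\theta P)^2}{P} + \frac{(\partial_\theta P)^2}{1-P}
  = \frac{(\partial_\theta P)^2}{P(1-P)},
\]
% so the threshold-dependent probability sensitivity is capped by the
% threshold-independent Fisher information:
\[
  \bigl|\partial_\theta P(\theta)\bigr|
  \;\le\; \sqrt{P(\theta)\bigl(1-P(\theta)\bigr)\,F(\theta)}.
\]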

Companion code repository: https://github.com/longitude-jyang/ProbSensitivityInfoBound
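The linked repository presumably contains the paper's own numerical examples. Independently of it, the following is a small hypothetical check, written for this page rather than taken from the repository, of the bound form sketched above, for a Gaussian output Y ~ N(theta, sigma^2), whose Fisher information with respect to the mean theta is 1/sigma^2; the example values sigma and y0 and the specific bound checked are assumptions of this sketch.

# Hypothetical numerical check of a bound of the form
#   |dP/dtheta| <= sqrt( P * (1 - P) * F(theta) ),
# for Y ~ N(theta, sigma^2) and P(theta) = Pr(Y > y0).
# Here dP/dtheta has the closed form phi((y0 - theta)/sigma) / sigma,
# and the Fisher information of the mean parameter is F = 1 / sigma^2.
import numpy as np
from scipy.stats import norm

sigma, y0 = 1.0, 0.5          # assumed example values for this sketch
F = 1.0 / sigma**2            # Fisher information of theta for N(theta, sigma^2)

for theta in np.linspace(-3.0, 3.0, 13):
    z = (y0 - theta) / sigma
    P = norm.sf(z)            # P(Y > y0) = 1 - Phi(z)
    dP = norm.pdf(z) / sigma  # exact derivative of P with respect to theta
    bound = np.sqrt(P * (1.0 - P) * F)
    assert abs(dP) <= bound + 1e-12
    print(f"theta={theta:+.2f}  |dP/dtheta|={abs(dP):.4f}  bound={bound:.4f}")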