Minimax Rates of Estimating Approximate Differential Privacy

From MaRDI portal

arXiv: 1905.10335
MaRDI QID: Q6319325

Sewoong Oh, Xiyang Liu

Publication date: 24 May 2019

Abstract: Differential privacy has become a widely accepted notion of privacy, leading to the introduction and deployment of numerous privatization mechanisms. However, ensuring the privacy guarantee is an error-prone process, both in designing and in implementing those mechanisms. Both types of errors would be greatly reduced if we had a data-driven approach to verifying privacy guarantees from black-box access to a mechanism. We pose this as a property estimation problem and study the fundamental trade-off between the accuracy of the estimated privacy guarantee and the number of samples required. We introduce a novel estimator that uses polynomial approximation of a carefully chosen degree to optimally trade off bias and variance. With $n$ samples, we show that this estimator achieves the performance of a straightforward plug-in estimator with $n \ln n$ samples, a phenomenon referred to as effective sample size amplification. The minimax optimality of the proposed estimator is proved by comparing it to a matching fundamental lower bound.
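As a concrete point of reference for the plug-in baseline mentioned in the abstract, the sketch below is a minimal empirical estimator of the approximate-DP parameter $\delta(\epsilon) = \sum_x \max(0, p(x) - e^{\epsilon} q(x))$, i.e., the smallest $\delta$ for which the mechanism's output distributions $p$ and $q$ on two neighboring inputs satisfy the $(\epsilon, \delta)$-DP inequality, computed from black-box samples. The function names and the randomized-response check are illustrative assumptions; this is not the paper's minimax-optimal polynomial-approximation estimator.

```python
import numpy as np
from collections import Counter

def plugin_delta_estimate(samples_p, samples_q, eps):
    """Plug-in estimate of delta(eps) = sum_x max(0, p(x) - exp(eps) * q(x))
    from samples of a mechanism run on two neighboring datasets."""
    n_p, n_q = len(samples_p), len(samples_q)
    counts_p, counts_q = Counter(samples_p), Counter(samples_q)
    delta_hat = 0.0
    for x, c in counts_p.items():
        p_hat = c / n_p                      # empirical probability under dataset D
        q_hat = counts_q.get(x, 0) / n_q     # empirical probability under dataset D'
        delta_hat += max(0.0, p_hat - np.exp(eps) * q_hat)
    return delta_hat

# Hypothetical check: randomized response on one bit (report truthfully w.p. 3/4)
# is (ln 3, 0)-DP, so the estimate at eps = ln 3 should be close to 0.
rng = np.random.default_rng(0)
n = 100_000
samples_p = (rng.random(n) < 0.25).astype(int).tolist()  # mechanism on bit 0: outputs 1 w.p. 0.25
samples_q = (rng.random(n) < 0.75).astype(int).tolist()  # mechanism on bit 1: outputs 1 w.p. 0.75
print(plugin_delta_estimate(samples_p, samples_q, eps=np.log(3)))
```

Per the abstract, the paper's estimator improves on this baseline by using a polynomial approximation of a carefully chosen degree to trade off bias and variance, which is what yields the $n \ln n$ effective sample size amplification over the plug-in approach.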




Has companion code repository: https://github.com/OpenMined/PyDPValidator






