PCPs and the Hardness of Generating Private Synthetic Data
Publication: 3000552
DOI: 10.1007/978-3-642-19571-6_24 · zbMath: 1295.94190 · OpenAlex: W1587575659 · MaRDI QID: Q3000552
Jonathan R. Ullman, Salil P. Vadhan
Publication date: 19 May 2011
Published in: Theory of Cryptography
Full work available at URL: https://doi.org/10.1007/978-3-642-19571-6_24
Keywords: privacy; constraint satisfaction problems; probabilistically checkable proofs; inapproximability; digital signatures
Related Items (13)
Separating Computational and Statistical Differential Privacy in the Client-Server Model
Strong Hardness of Privacy from Weak Traitor Tracing
Fingerprinting Codes and the Price of Approximate Differential Privacy
Private Sampling: A Noiseless Approach for Generating Differentially Private Synthetic Data
Covariance's loss is privacy's gain: computationally efficient, private and accurate synthetic data
Differential Privacy on Finite Computers
Order-Revealing Encryption and the Hardness of Private Learning
Answering $n^2+o(1)$ Counting Queries with Differential Privacy is Hard
Segmentation, Incentives, and Privacy
What Can We Learn Privately?
Unnamed Item
The Complexity of Differential Privacy
Efficient algorithms for privately releasing marginals via convex relaxations
This page was built for publication: PCPs and the Hardness of Generating Private Synthetic Data