Differential privacy: getting more for less
Publication:6198644
DOI: 10.4171/icm2022/196
OpenAlex: W4389775201
MaRDI QID: Q6198644
Publication date: 20 March 2024
Published in: International Congress of Mathematicians
Full work available at URL: https://doi.org/10.4171/icm2022/196
Keywords: machine learning; differential privacy; private data analysis; private machine learning; privacy-preserving data analysis
MSC classification: Computational learning theory (68Q32); Learning and adaptive systems in artificial intelligence (68T05); Randomized algorithms (68W20); Privacy of data (68P27)
Cites Work
- Bounds on the sample complexity for private learning and private data release
- Distributed differential privacy via shuffling
- The Complexity of Computing the Optimal Composition of Differential Privacy
- Randomized Response: A Survey Technique for Eliminating Evasive Answer Bias
- Rényi Divergence and Kullback-Leibler Divergence
- What Can We Learn Privately?
- Concentrated Differential Privacy: Simplifications, Extensions, and Lower Bounds
- The Algorithmic Foundations of Differential Privacy
- Our Data, Ourselves: Privacy Via Distributed Noise Generation
- Differential privacy and robust statistics
- Composable and versatile privacy via truncated CDP
- Amplification by Shuffling: From Local to Central Differential Privacy via Anonymity
- Theory of Cryptography
- Differential Privacy