Differentially private SGD with random features
Publication: 6542573
DOI: 10.1007/s11766-024-5037-0
MaRDI QID: Q6542573
Publication date: 22 May 2024
Published in: Applied Mathematics. Series B (English Edition)
Keywords: learning theory; reproducing kernel Hilbert spaces; stochastic gradient descent; differential privacy; random features
Mathematics Subject Classification: Applications of mathematical programming (90C90); Learning and adaptive systems in artificial intelligence (68T05); Privacy of data (68P27)
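As an illustration of the topic the keywords point to, the sketch below combines random Fourier features (approximating a kernel in a reproducing kernel Hilbert space) with differentially private SGD via per-example gradient clipping and Gaussian noise. This is a minimal, generic sketch of that standard combination, not the algorithm analyzed in the publication; all function names, parameters, and values are illustrative assumptions.

```python
# Illustrative sketch only: random Fourier features + DP-SGD
# (per-example gradient clipping and Gaussian noise).
import numpy as np

rng = np.random.default_rng(0)

def make_rff(d_in, n_features, gamma):
    # k(x, y) ~= phi(x) . phi(y) with phi(x) = sqrt(2/D) * cos(W x + b),
    # a standard random Fourier feature map for the Gaussian kernel.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(n_features, d_in))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    def phi(X):
        return np.sqrt(2.0 / n_features) * np.cos(X @ W.T + b)
    return phi

def dp_sgd(X, y, phi, steps=2000, lr=0.05, clip=1.0, noise_mult=1.0, lam=1e-3):
    # SGD on the random-feature model with squared loss and a ridge term.
    Z = phi(X)
    w = np.zeros(Z.shape[1])
    n = len(y)
    for _ in range(steps):
        i = rng.integers(n)                  # single-sample stochastic gradient
        residual = Z[i] @ w - y[i]
        g = residual * Z[i] + lam * w
        # Clip the per-example gradient so its norm (sensitivity) is at most `clip`.
        g *= min(1.0, clip / (np.linalg.norm(g) + 1e-12))
        # Gaussian mechanism: noise scaled to the clipping bound.
        g += rng.normal(scale=noise_mult * clip, size=g.shape)
        w -= lr * g
    return w

# Toy usage on synthetic data.
X = rng.normal(size=(500, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
phi = make_rff(d_in=3, n_features=200, gamma=1.0)
w = dp_sgd(X, y, phi)
print("training MSE:", np.mean((phi(X) @ w - y) ** 2))
```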
Cites Work
- Nonparametric stochastic approximation with large step-sizes
- Online gradient descent learning algorithms
- Optimum bounds for the distributions of martingales in Banach spaces
- Online gradient descent algorithms for functional data learning
- Fast and strong convergence of online learning algorithms
- Learning theory estimates via integral operators and their approximations
- Differentially private SGD with non-smooth losses
- The Algorithmic Foundations of Differential Privacy
- Learning Theory
- Online Regularized Classification Algorithms
- High-Dimensional Statistics
- Optimal Rates for Multi-pass Stochastic Gradient Methods
- Private stochastic convex optimization: optimal rates in linear time
- Theory of Reproducing Kernels
- Theory of Cryptography
- Capacity dependent analysis for functional online learning algorithms