Improved Rates for Differentially Private Stochastic Convex Optimization with Heavy-Tailed Data
From MaRDI portal
Publication: Q6369274
arXiv: 2106.01336 · MaRDI QID: Q6369274
Gautam Kamath, Huanyu Zhang, Xingtu Liu
Publication date: 2 June 2021
Abstract: We study stochastic convex optimization with heavy-tailed data under the constraint of differential privacy (DP). Most prior work on this problem is restricted to the case where the loss function is Lipschitz. Instead, as introduced by Wang, Xiao, Devadas, and Xu [WangXDX20], we study general convex loss functions under the assumption that the distribution of gradients has bounded k-th moments. We provide improved upper bounds on the excess population risk under concentrated DP for convex and strongly convex loss functions. Along the way, we derive new algorithms for private mean estimation of heavy-tailed distributions, under both pure and concentrated DP. Finally, we prove nearly-matching lower bounds for private stochastic convex optimization with strongly convex losses and for mean estimation, showing new separations between pure and concentrated DP.
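The abstract mentions private mean estimation of heavy-tailed distributions under pure DP. As a rough illustration of the general flavor of such estimators (not the paper's algorithm), the following sketch implements the classic clip-and-noise approach: clip each sample to a bounded range so the empirical mean has bounded sensitivity, then add Laplace noise calibrated to that sensitivity. The function name, the fixed clipping radius, and the choice of noise mechanism are illustrative assumptions.

```python
import math
import random


def private_clipped_mean(samples, clip, epsilon, seed=0):
    """Illustrative pure epsilon-DP mean estimate (not the paper's method).

    Each sample is clipped to [-clip, clip], so changing one sample moves
    the empirical mean by at most 2*clip/n; Laplace noise with scale
    2*clip/(n*epsilon) then gives epsilon-DP for the clipped mean.
    """
    rng = random.Random(seed)
    n = len(samples)
    # Clip samples to bound the sensitivity of the mean.
    clipped = [max(-clip, min(clip, x)) for x in samples]
    mean = sum(clipped) / n
    # Laplace(scale) noise as the difference of two exponentials.
    scale = 2.0 * clip / (n * epsilon)
    noise = rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)
    return mean + noise


# Example: heavy outliers are truncated by the clip before averaging.
est = private_clipped_mean([1000.0] * 100, clip=1.0, epsilon=1e9)
print(est)  # close to 1.0: every sample is clipped to the radius
```

For heavy-tailed data with bounded k-th gradient moments, the clipping radius trades bias (clipping true mass) against noise (sensitivity), which is the tension the paper's bounds quantify; this sketch fixes the radius by hand rather than tuning it.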