Efficient Private SCO for Heavy-Tailed Data via Clipping
Publication: 6403203
arXiv: 2206.13011
MaRDI QID: Q6403203
Author name not available
Publication date: 26 June 2022
Abstract: We consider stochastic convex optimization (SCO) for heavy-tailed data under the guarantee of differential privacy (DP). Prior work on this problem is restricted to the gradient descent (GD) method, which is inefficient for large-scale problems. In this paper, we resolve this issue and derive the first high-probability bounds for a private stochastic method with clipping. For general convex problems, we derive two excess population risk bounds, under the bounded and unbounded domain assumptions respectively (here n is the sample size, d is the dimension of the data, β is the confidence level, and ε is the privacy level). We then extend our analysis to the strongly convex case and the non-smooth case (the latter covers generalized smooth objectives with Hölder-continuous gradients), and establish new excess risk bounds without the bounded domain assumption. The results above achieve lower excess risks and gradient complexities than existing methods in the corresponding cases. Numerical experiments are conducted to justify the theoretical improvements.
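The clipping mechanism the abstract builds on can be illustrated by the generic clipped, noise-injected stochastic gradient template. The sketch below is an illustrative Python example of that template, not the paper's algorithm: the function name clipped_private_sgd, the grad_fn interface, and the simple per-step Gaussian noise calibration (with privacy composition accounting omitted) are all hypothetical assumptions.

    # Illustrative sketch of clipped, noise-injected private SGD.
    # NOT the paper's method; names and constants are hypothetical.
    import numpy as np

    def clipped_private_sgd(grad_fn, data, w0, steps=100, lr=0.1,
                            clip=1.0, noise_mult=1.0, batch=64, seed=0):
        rng = np.random.default_rng(seed)
        w = w0.copy()
        for _ in range(steps):
            idx = rng.choice(len(data), size=batch, replace=False)
            grads = np.stack([grad_fn(w, data[i]) for i in idx])
            # Clip each per-sample gradient to L2 norm at most `clip`;
            # this bounds per-sample sensitivity even when raw gradients
            # are heavy-tailed (unbounded moments).
            norms = np.linalg.norm(grads, axis=1, keepdims=True)
            grads = grads * np.minimum(1.0, clip / np.maximum(norms, 1e-12))
            # Gaussian noise scaled to the clipping threshold; a suitable
            # noise multiplier yields a DP guarantee (accounting omitted).
            noise = rng.normal(0.0, noise_mult * clip, size=w.shape)
            w = w - lr * (grads.mean(axis=0) + noise / batch)
        return w

Given a problem-specific grad_fn(w, x) returning a per-sample gradient, the clipping step caps each sample's influence at the threshold, which is what makes the noise calibration, and hence the privacy guarantee, possible without any bound on the raw gradient distribution.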
Has companion code repository: https://github.com/jchenhan/aclippingdp