Statistica Sinica 35 (2025), 693-712

DIFFERENTIALLY PRIVATE REGULARIZED STOCHASTIC
CONVEX OPTIMIZATION WITH HEAVY-TAILED DATA

Haihan Xie1, Matthew Pietrosanu1, Yi Liu1,
Wei Tu2, Bei Jiang1 and Linglong Kong*1

1University of Alberta and 2Queen's University

Abstract: Existing privacy guarantees for convex optimization algorithms do not apply to heavy-tailed data with regularized estimation. This is a notable gap in the differential privacy (DP) literature, given the broad prevalence of non-Gaussian data and penalized optimization problems. In this work, we propose three (ϵ, δ)-DP methods for regularized convex optimization and derive bounds on their population excess risks in a framework that accommodates heavy-tailed data with fewer assumptions (relative to previous works). This work is the first to address DP in generic regularized convex optimization problems with heavy-tailed responses. Two of our methods augment a basic (ϵ, δ)-DP algorithm with robust procedures for privately estimating minibatch gradients. Our numerical analyses highlight the performance of our methods relative to data dimensionality, batch size, and privacy budget, and suggest settings where each approach is favorable.

Key words and phrases: Error bound, non-smooth regularization, privacy protection, randomized mechanism.