Privacy Masking Stochastic Subgradient-Push Algorithm for Distributed Online Optimization

abstract

  • This article investigates distributed online optimization for a group of units communicating over time-varying unbalanced directed networks. The units aim to cooperatively minimize the sum of all locally known convex cost functions (the global cost function) while keeping their local cost functions well masked. To address such optimization problems in a collaborative and distributed fashion, a differentially private distributed stochastic subgradient-push algorithm, called DP-DSSP, is proposed, in which units interact with their in-neighbors and collectively optimize the global cost function. Unlike most existing distributed algorithms, which do not consider privacy, DP-DSSP employs a differential privacy strategy to mask the information of participating units, which is more practical in applications involving sensitive data, such as military affairs or medical treatment. An important feature of DP-DSSP is that it tackles distributed online optimization over time-varying unbalanced directed networks. Theoretical analysis shows that DP-DSSP guarantees differential privacy while achieving sublinear regret. A trade-off between the privacy level and the accuracy of DP-DSSP is also revealed. Furthermore, DP-DSSP is capable of handling arbitrarily large but uniformly bounded delays in the communication links. Finally, simulation experiments confirm the practicality of DP-DSSP and the findings of this article.
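  The mechanics the abstract describes can be illustrated with a minimal sketch (not the paper's exact DP-DSSP): push-sum subgradient descent over a fixed unbalanced directed ring, with decaying Laplace noise added to the transmitted states for privacy masking. The network, step-size rule, noise schedule, and cost functions below are illustrative assumptions, not taken from the article.

  ```python
  import numpy as np

  # Illustrative sketch of a noisy subgradient-push step (assumed setup,
  # not the paper's DP-DSSP): agents minimize sum_i (x - c_i)^2, whose
  # minimizer is mean(c), over an unbalanced directed graph.
  rng = np.random.default_rng(0)

  n = 3
  c = np.array([1.0, 2.0, 6.0])      # f_i(x) = (x - c_i)^2
  outs = [[0, 1, 2], [1, 2], [2, 0]] # out-neighbors (unequal out-degrees)
  T = 2000

  w = np.zeros(n)   # push-sum numerator (state)
  y = np.ones(n)    # push-sum denominator (weight)
  z = np.zeros(n)   # each agent's estimate of the minimizer

  for t in range(T):
      alpha = 0.5 / np.sqrt(t + 1)   # diminishing step size (assumed)
      s = 0.02 * (0.98 ** t)         # decaying Laplace noise scale (assumed)

      # Column-stochastic mixing: each agent splits its values evenly
      # among its out-neighbors, adding noise to the shared state only.
      w_new = np.zeros(n)
      y_new = np.zeros(n)
      for i in range(n):
          for j in outs[i]:
              w_new[j] += w[i] / len(outs[i]) + rng.laplace(scale=s)
              y_new[j] += y[i] / len(outs[i])

      z = w_new / y_new              # de-biased local estimate
      grad = 2 * (z - c)             # subgradient of each local cost
      w = w_new - alpha * grad
      y = y_new

  print(z)  # all agents' estimates approach mean(c) = 3
  ```

  The decaying noise scale reflects the trade-off the abstract mentions: larger or more slowly decaying noise gives stronger masking but degrades the accuracy of the final estimates.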

author list (cited authors)

  • Lü, Q., Liao, X., Xiang, T., Li, H., & Huang, T.

publication date

  • January 1, 2021