A Second-Order Projected Primal-Dual Dynamical System for Distributed Optimization and Learning. Academic Article

abstract

  • This article develops distributed optimization strategies for a class of machine learning problems over a directed network of computing agents. In these problems, the global objective function is the sum of local objective functions, each of which is convex and known only to its corresponding computing agent. A second-order Nesterov accelerated dynamical system with a time-varying damping coefficient is developed to solve such problems. To handle the constraints, a projected primal-dual method is incorporated into the Nesterov accelerated system. By exploiting the cocoercivity of the associated maximal monotone operator, it is shown that the trajectories of the Nesterov accelerated dynamical system reach consensus at the optimal solution, provided that the damping coefficient and gains satisfy certain technical conditions. Finally, the theoretical results are validated on an email classification problem and a logistic regression problem in machine learning. (For illustration, a rough discretized sketch of a system of this type is given after the abstract.)

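The abstract does not give the exact equations of the proposed system, so the following is only a minimal sketch under assumptions: a commonly studied second-order primal-dual consensus dynamic with Nesterov-style vanishing damping alpha/t, an undirected ring graph in place of the article's directed network, no projection step (the illustrative problem is unconstrained), synthetic logistic regression data, and arbitrarily chosen gains and step size. All variable names and numerical values below are illustrative, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: each agent privately holds one slice of a logistic regression data set.
n_agents, n_samples, dim = 4, 50, 3
A_data = rng.normal(size=(n_agents, n_samples, dim))
w_true = rng.normal(size=dim)
labels = np.sign(A_data @ w_true + 0.1 * rng.normal(size=(n_agents, n_samples)))

def local_grad(i, w):
    # Gradient of agent i's local regularized logistic loss (known only to agent i).
    z = labels[i] * (A_data[i] @ w)
    return -(A_data[i].T @ (labels[i] / (1.0 + np.exp(z)))) / n_samples + 1e-3 * w

# Communication graph: an undirected ring (a simplification of the directed networks
# treated in the article); Lap is its graph Laplacian, so Lap @ x = 0 iff consensus.
Adj = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    Adj[i, (i + 1) % n_agents] = Adj[(i + 1) % n_agents, i] = 1.0
Lap = np.diag(Adj.sum(axis=1)) - Adj

# Second-order primal dynamics with vanishing damping alpha/t, first-order dual dynamics,
# discretized with a semi-implicit Euler scheme (gains and step size are illustrative).
alpha, beta, h = 3.0, 1.0, 0.01
x = rng.normal(size=(n_agents, dim))   # each agent's estimate of the decision variable
v = np.zeros_like(x)                   # primal "velocities"
lam = np.zeros_like(x)                 # dual variables enforcing the consensus constraint

for k in range(1, 15001):
    t = 1.0 + k * h                    # time offset avoids a blow-up of alpha/t near t = 0
    grad = np.stack([local_grad(i, x[i]) for i in range(n_agents)])
    # x_ddot = -(alpha/t) * x_dot - grad_f(x) - Lap @ lam - beta * Lap @ x
    v += h * (-(alpha / t) * v - grad - Lap @ lam - beta * (Lap @ x))
    x += h * v
    lam += h * (Lap @ x)               # lam_dot = Lap @ x drives the agents toward consensus

print("consensus spread :", float(np.max(np.abs(x - x.mean(axis=0)))))
print("agent 0 estimate :", x[0])
```

The damping constant alpha = 3 mirrors the usual choice in continuous-time Nesterov analyses, and the semi-implicit update (velocity first, then position, then dual) is used only to keep the crude discretization stable; if the feasible set were constrained, the primal update would additionally pass through a projection operator, which is the "projected" element emphasized in the abstract.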
published proceedings

  • IEEE Trans Neural Netw Learn Syst

altmetric score

  • 0.25

author list (cited authors)

  • Wang, X., Yang, S., Guo, Z., & Huang, T.

citation count

  • 5

complete list of authors

  • Wang, Xiaoxuan||Yang, Shaofu||Guo, Zhenyuan||Huang, Tingwen

publication date

  • September 2023