Decentralized Stochastic Non-Convex Optimization over Weakly Connected Time-Varying Digraphs
In this paper, we consider decentralized stochastic non-convex optimization over a class of weakly connected digraphs. First, we quantify the convergence behavior of the weight matrices associated with this class of digraphs. Then, by leveraging the perturbed push-sum protocol and gradient tracking techniques, we propose a decentralized stochastic algorithm that converges to first-order stationary points of non-convex problems with provable convergence rates. The digraph structure considered in this work generalizes existing settings, so the proposed algorithm applies to a wider range of practical decentralized learning scenarios. Numerical results showcase the strengths of our theory and the superiority of the proposed algorithm over existing counterparts on decentralized training problems.
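To illustrate the core mechanism the abstract refers to, the following is a minimal numerical sketch of push-sum combined with gradient tracking over a time-varying digraph. It is not the paper's algorithm: it assumes simple quadratic local losses (the paper targets non-convex objectives), deterministic rather than stochastic gradients, scalar decision variables, and a hypothetical 4-node network that alternates between a ring digraph and its reversal.

```python
import numpy as np

def column_stochastic(edges, n):
    """Build a column-stochastic mixing matrix from a digraph edge list.
    A[i, j] is the weight node j assigns to the message sent to node i;
    each column sums to 1, as push-sum requires."""
    A = np.zeros((n, n))
    out_deg = np.zeros(n)
    for (j, _) in edges:
        out_deg[j] += 1
    for j in range(n):
        A[j, j] = 1.0 / (out_deg[j] + 1)  # self-loop weight
    for (j, i) in edges:
        A[i, j] = 1.0 / (out_deg[j] + 1)
    return A

n = 4
a = np.array([1.0, 2.0, 3.0, 6.0])   # local targets: f_i(x) = (x - a_i)^2 / 2
grad = lambda x: x - a               # per-node gradients, evaluated coordinate-wise

# Two digraphs alternating over time; their union is strongly connected.
A0 = column_stochastic([(i, (i + 1) % n) for i in range(n)], n)  # ring
A1 = column_stochastic([((i + 1) % n, i) for i in range(n)], n)  # reversed ring

z = np.zeros(n)    # push-sum numerators (one scalar variable per node)
w = np.ones(n)     # push-sum weights, used to de-bias the iterates
x = z / w
y = grad(x)        # gradient trackers, initialized to the local gradients
alpha = 0.05       # step size

for t in range(3000):
    A = A0 if t % 2 == 0 else A1          # time-varying mixing matrix
    z = A @ z - alpha * y                 # push-sum step with tracked gradient
    w = A @ w
    x_new = z / w                         # de-biased iterates
    y = A @ y + grad(x_new) - grad(x)     # gradient tracking update
    x = x_new

print(x)  # each node approaches the global minimizer mean(a) = 3.0
```

Because the mixing matrices are column-stochastic, the tracker sum satisfies `sum(y_t) = sum(grad(x_t))` at every step, which is what lets every node steer toward a stationary point of the global objective rather than its local one.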