There is a common belief that ADMM, a popular algorithm for distributed convex optimization over graphs, is faster than another distributed algorithm typically referred to as Average Consensus. This belief rests on the observation that the ratio of the number of iterations required to achieve a desired error with respect to the optimal solution under ADMM versus under Average Consensus goes to zero as the graph becomes larger or less connected. In this work, we provide a closed-form expression for the convergence rate of ADMM, for scalar quadratic cost functions with identical curvature, as a function of the essential spectral radius of the graph, a measure of graph connectivity, and we show that ADMM can converge more slowly than Average Consensus when the graph is highly connected. Moreover, via extensive simulations, we show that the performance of ADMM, unlike that of Average Consensus, rapidly degrades as the cost functions become skewed, making the latter approach competitive also for sparse graphs.
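As a hedged illustration of the role the essential spectral radius plays in the rate comparison above, the following Python sketch (not from the paper; the ring graph and Metropolis-style weights are assumptions for demonstration) runs Average Consensus and checks that the distance to the average contracts geometrically at the essential spectral radius of the weight matrix:

```python
# Sketch: Average Consensus x_{k+1} = W x_k on an n-node ring graph.
# For symmetric doubly stochastic W, the error ||x_k - avg*1|| decays
# at least as fast as rho_ess^k, where rho_ess is the second-largest
# eigenvalue magnitude of W (the essential spectral radius).
import numpy as np

def ring_consensus_weights(n):
    # Each node averages equally with itself and its two ring neighbours,
    # giving a symmetric, doubly stochastic weight matrix W.
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 1.0 / 3.0
        W[i, (i - 1) % n] = 1.0 / 3.0
        W[i, (i + 1) % n] = 1.0 / 3.0
    return W

n, iters = 20, 200
W = ring_consensus_weights(n)
# Essential spectral radius: second-largest eigenvalue magnitude of W.
rho_ess = sorted(abs(np.linalg.eigvals(W)))[-2]

rng = np.random.default_rng(0)
x = rng.standard_normal(n)
avg = x.mean()                       # preserved by doubly stochastic W
err0 = np.linalg.norm(x - avg)
for _ in range(iters):
    x = W @ x                        # one consensus iteration
err = np.linalg.norm(x - avg)

print(rho_ess < 1.0)                 # connected graph: rho_ess < 1
print(err <= rho_ess**iters * err0 + 1e-9)  # geometric decay bound holds
```

A less connected graph (e.g. a longer ring) pushes `rho_ess` toward 1 and slows convergence, which is the connectivity dependence the abstract compares against ADMM's rate.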