MAHI 2013 workshop: Methodological Aspects of Hyperspectral Imaging


Title: Distributed Optimization
Richard Heusdens, Delft University of Technology, (NL)

In this talk we will focus on distributed optimization algorithms. We will discuss optimization methods based on inference in graphical models, such as the min-sum algorithm and related algorithms, and methods based on convex optimization, such as the alternating direction method of multipliers (ADMM) and its variations. With respect to inference-based algorithms, we will focus on the (generalized) linear coordinate-descent algorithm, an iterative optimization algorithm with a convergence rate comparable to that of the min-sum algorithm, but with significantly fewer parameters to transmit per iteration. With respect to convex-optimization-based algorithms, we will focus on ADMM, a simple but powerful algorithm that is well suited to distributed convex optimization, and on the bi-alternating direction method of multipliers (BiADMM), an algorithm that iteratively minimizes an augmented bi-conjugate function, from which its convergence follows naturally. Unlike ADMM, which involves three updates per iteration, BiADMM needs only two coordinate-descent operations.
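To make the "three updates per iteration" structure of ADMM concrete, here is a minimal sketch of global-consensus ADMM with its primal (x), consensus (z), and scaled dual (u) updates. The local costs f_i(x) = (x - a_i)^2 / 2, the node data a, the penalty rho, and the iteration count are all illustrative assumptions, not taken from the talk; with these quadratic costs each x-update has a closed form.

```python
def consensus_admm(a, rho=1.0, iters=200):
    """Distributed minimization of sum_i (x - a_i)^2 / 2 over a consensus
    variable z, via the three ADMM updates per iteration."""
    n = len(a)
    x = [0.0] * n        # local primal variables, one per node
    u = [0.0] * n        # scaled dual variables for the constraints x_i = z
    z = 0.0              # global consensus variable
    for _ in range(iters):
        # 1) x-update: each node solves argmin_x (x - a_i)^2/2 + (rho/2)(x - z + u_i)^2
        x = [(a[i] + rho * (z - u[i])) / (1.0 + rho) for i in range(n)]
        # 2) z-update: average the local estimates (plus duals)
        z = sum(x[i] + u[i] for i in range(n)) / n
        # 3) u-update: dual ascent on the consensus residual x_i - z
        u = [u[i] + x[i] - z for i in range(n)]
    return z

print(consensus_admm([1.0, 2.0, 6.0]))  # converges to the average of the data, 3.0
```

For these quadratic costs the consensus minimizer is the mean of the a_i, so the iterate z approaches 3.0; BiADMM, as described above, would reach a comparable fixed point with only two coordinate-descent operations per iteration.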