CORE Series: Catalyst, Generic Acceleration Scheme for Gradient-based Optimization

Zaid Harchaoui, Department of Statistics, University of Washington
Tuesday, April 18, 2017 - 4:00pm to 5:00pm
EEB 125

We introduce a generic scheme called Catalyst for accelerating first-order optimization methods in the sense of Nesterov, which builds upon a new analysis of the accelerated proximal point algorithm. The proposed approach consists of minimizing a convex objective by approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. This strategy applies to a large class of algorithms, including gradient descent, block coordinate descent, SAG, SAGA, SDCA, SVRG, Finito/MISO, and their proximal variants. For all of these methods, we provide acceleration and explicit support for non-strongly convex objectives. Furthermore, the approach extends to possibly nonconvex optimization problems without sacrificing the rate of convergence to stationary points. We present experimental results showing that the Catalyst acceleration scheme is effective in practice, especially for ill-conditioned problems, where we measure significant improvements.
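To make the idea of "approximately solving a sequence of well-chosen auxiliary problems" concrete, here is a minimal, hypothetical sketch of a Catalyst-style outer loop in Python, with plain gradient descent standing in for the inner method M. The auxiliary problem h_k(x) = f(x) + (kappa/2)||x - y_{k-1}||^2, the momentum recursion, and the parameter names (kappa, L, alpha, beta) follow the standard accelerated proximal point setup described in the abstract; it is an illustration under those assumptions, not the speaker's implementation.

```python
import numpy as np

# Hypothetical illustration of a Catalyst-style outer loop (not the authors' code).
# Assumed setup: minimize a smooth convex f by repeatedly (and approximately)
# solving the regularized auxiliary problem
#     h_k(x) = f(x) + (kappa / 2) * ||x - y_{k-1}||^2
# with an inner first-order method, then applying a Nesterov-style extrapolation.

def gradient_descent(grad, x0, step, iters):
    """Inner method M: plain gradient descent on the auxiliary objective."""
    x = x0.copy()
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def catalyst(grad_f, x0, kappa, L, outer_iters=50, inner_iters=100):
    """Catalyst-style acceleration wrapped around gradient descent.

    grad_f : gradient of the (convex, L-smooth) objective f
    kappa  : proximal regularization parameter
    L      : smoothness constant of f (so h_k is (L + kappa)-smooth)
    """
    x_prev = x0.copy()
    y = x0.copy()
    alpha = 1.0  # momentum sequence for the non-strongly convex case
    for _ in range(outer_iters):
        # Gradient of the auxiliary problem h_k centered at the current prox point y.
        grad_h = lambda x, y=y: grad_f(x) + kappa * (x - y)
        # Approximately minimize h_k with the inner method, warm-started at y.
        x = gradient_descent(grad_h, y, step=1.0 / (L + kappa), iters=inner_iters)
        # Update the extrapolation coefficient and the prox center.
        alpha_next = (np.sqrt(alpha**4 + 4 * alpha**2) - alpha**2) / 2
        beta = alpha * (1 - alpha) / (alpha**2 + alpha_next)
        y = x + beta * (x - x_prev)
        x_prev, alpha = x, alpha_next
    return x_prev

# Toy usage on a least-squares problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
b = rng.standard_normal(100)
L = np.linalg.norm(A, 2) ** 2            # smoothness constant of 0.5 * ||Ax - b||^2
grad_f = lambda x: A.T @ (A @ x - b)
x_hat = catalyst(grad_f, np.zeros(20), kappa=0.1 * L, L=L)
```

In this sketch the inner gradient descent could be swapped for any of the methods named in the abstract (SAG, SAGA, SVRG, etc.); the choice of kappa trades off how well-conditioned each auxiliary problem is against how many outer iterations are needed.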