Abstract: We consider minimization of stochastic functionals that are compositions of a (potentially) non-smooth convex function h and a smooth function c. We develop two stochastic methods--a stochastic prox-linear algorithm and a stochastic (generalized) sub-gradient procedure--and prove that, under mild technical conditions, each converges to first-order stationary points of the stochastic objective. Additionally, we analyze this problem class in the context of phase retrieval and more generic nonlinear modeling problems, showing that we can solve these problems (even with faulty measurements) with extremely high probability under appropriate random measurement models. We present substantial experiments that demonstrate the practical effectiveness of the procedures.
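To make the problem class concrete, here is a minimal sketch (not the speaker's exact algorithm) of a stochastic subgradient method for robust phase retrieval, minimizing (1/m) Σᵢ |⟨aᵢ, x⟩² − bᵢ|. This is an instance of the composite structure above, with h(z) = |z| convex and non-smooth and cᵢ(x) = ⟨aᵢ, x⟩² − bᵢ smooth. The Polyak-type step size and the measurement model in the usage below are illustrative assumptions (the step size presumes exact, noiseless measurements so the minimal value is zero):

```python
import numpy as np

def stochastic_subgradient_phase_retrieval(A, b, x0, steps=5000, seed=0):
    """Stochastic subgradient sketch for min_x (1/m) sum_i |<a_i,x>^2 - b_i|.

    A: (m, n) measurement vectors a_i as rows; b: (m,) measured values.
    """
    x = np.asarray(x0, dtype=float).copy()
    m, _ = A.shape
    rng = np.random.default_rng(seed)
    for _ in range(steps):
        i = rng.integers(m)                       # sample one measurement
        r = A[i] @ x
        residual = r * r - b[i]                   # c_i(x)
        if residual == 0.0:
            continue                              # subgradient is zero here
        # chain rule: sign(c_i(x)) * grad c_i(x) is a subgradient of |c_i(x)|
        g = np.sign(residual) * 2.0 * r * A[i]
        gn2 = g @ g
        if gn2 == 0.0:
            continue
        # Polyak-type step (illustrative; assumes minimal value 0, i.e. exact data)
        x -= (abs(residual) / gn2) * g
    return x
```

With Gaussian measurements aᵢ and bᵢ = ⟨aᵢ, x*⟩², the iterates can only recover x* up to a global sign, so recovery error is measured as min(‖x̂ − x*‖, ‖x̂ + x*‖).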
Bio: John Duchi is an assistant professor of Statistics and Electrical Engineering and (by courtesy) Computer Science at Stanford University, with graduate degrees from UC Berkeley and undergraduate degrees from Stanford. His work focuses on large-scale optimization problems arising in statistics and machine learning, robustness and problems with uncertain data, and information-theoretic aspects of statistical learning. He has won a number of awards and fellowships, including a best paper award at the International Conference on Machine Learning, an NSF CAREER award, and a Sloan Fellowship in Mathematics.
*Joint with Statistics seminar.