Stochastic Optimization under Hidden Convexity

Niao He, ETH Zurich
CSE2 G01 (Gates Center)

In this talk, we address a broad class of nonconvex stochastic optimization problems that exhibit a hidden convexity structure, i.e., those admitting a convex reformulation through some nonlinear transformation. Such problems are prevalent in various data-driven applications, including optimal control, revenue and inventory management, and convex reinforcement learning. However, solving the convex reformulation may be computationally impractical. Instead, we focus on addressing the original hidden-convex problems directly using stochastic first-order methods. We first examine the simplest projected stochastic (sub-)gradient methods and present sample complexity guarantees for achieving global convergence in both smooth and nonsmooth settings. We further introduce new algorithms with improved sample complexity for special problem settings and demonstrate their efficiency in practice.
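
To fix ideas, here is a minimal toy sketch (our own illustration, not material from the talk) of the hidden convexity setup and the projected stochastic gradient method the abstract describes. The objective F(x) = E[(c(x) - xi)^2] with c(x) = x + x^3 is nonconvex in x, but substituting u = c(x), an invertible nonlinear transformation, makes it convex in u; the sketch runs projected stochastic gradient descent on x directly. The specific objective, transformation, step sizes, and feasible set are all illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def c(x):
        # Invertible nonlinear transformation: u = c(x) renders the
        # problem convex in u (c is strictly increasing, c'(x) >= 1).
        return x + x**3

    def stochastic_grad(x):
        # One-sample gradient of (c(x) - xi)^2 w.r.t. x, with xi ~ N(2, 0.1^2).
        # F(x) = E[(c(x) - xi)^2] is nonconvex in x but hidden-convex.
        xi = rng.normal(2.0, 0.1)
        return 2.0 * (c(x) - xi) * (1.0 + 3.0 * x**2)

    def project(x, lo=-2.0, hi=2.0):
        # Euclidean projection onto the feasible interval [lo, hi].
        return min(max(x, lo), hi)

    x = -1.5  # initialize in the nonconvex landscape
    for t in range(1, 20001):
        eta = 0.01 / np.sqrt(t)  # diminishing step size
        x = project(x - eta * stochastic_grad(x))

    # Expected to approach x = 1, the global minimizer, since c(1) = E[xi] = 2.
    print(f"x = {x:.3f}")

Despite the nonconvexity in x, the iterates approach the global minimizer rather than a merely stationary point, which is the kind of global convergence guarantee the talk's sample complexity results formalize.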