On the Convex Gaussian Minimax Theorem and Minimum-Norm Interpolation

Gil Kur (ETH)

We revisit sharp risk bounds for the minimum-ℓ1-norm interpolant (basis pursuit) in high-dimensional linear regression. These bounds were first obtained by Wang, Donhauser and Yang (2022) via the Convex Gaussian Minimax Theorem (CGMT), in an analysis motivated by the widely used Gaussian-width and uniform-convergence framework of Koehler et al. (2021); in particular, their results disprove a conjecture of Chinot, Löffler and van de Geer (2021). In contrast, our proof does not rely on Gaussian comparison inequalities or CGMT. Instead, it uses tools from high-dimensional geometry, superconcentration, and the geometry of Gaussian polytopes. We furthermore show that CGMT-based analyses of minimum-norm interpolation can be unexpectedly sensitive to very fine estimates of auxiliary Gaussian optimization problems, and we argue that CGMT may not always be the most appropriate tool for studying the mean-squared error of minimum-norm interpolants.
