Bilipschitz embeddings for invariant machine learning 

Dustin Mixon, OSU
-
Gates Commons CSE1 691
Machine learning algorithms are typically designed for Euclidean data, but many natural datasets come with symmetries: a group G of isometries acts on a Euclidean space V, and points in the same orbit represent the same object. That means the true data space is not V, but the orbit space V/G. Invariant machine learning represents this quotient by a G-invariant feature map into Euclidean space. For robustness, especially against adversarial examples, this feature map should be bilipschitz with respect to the quotient metric. Sadly, vanilla polynomial invariants fail to be bilipschitz, so we need to move beyond classical invariant theory. In this talk, we present low-distortion embeddings in a variety of settings, and we conclude with several open problems.
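As a concrete illustration (not drawn from the talk itself), take G to be the symmetric group S_n permuting the coordinates of V = R^n. Here the sort map, which simply sorts the coordinates of a vector, is a classical G-invariant feature map that is bilipschitz for the quotient metric, in fact an isometry onto its image. A minimal Python sketch verifying this numerically by brute force over permutations:

```python
import math
import random
from itertools import permutations

def euclid(x, y):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def quotient_dist(x, y):
    """Quotient metric on R^n / S_n:
    d([x], [y]) = min over permutations sigma of ||x - sigma(y)||."""
    return min(euclid(x, p) for p in permutations(y))

def sort_embed(x):
    """S_n-invariant feature map: sort the coordinates."""
    return tuple(sorted(x))

if __name__ == "__main__":
    random.seed(0)
    for _ in range(100):
        x = [random.gauss(0, 1) for _ in range(4)]
        y = [random.gauss(0, 1) for _ in range(4)]
        lhs = euclid(sort_embed(x), sort_embed(y))
        rhs = quotient_dist(x, y)
        # The sort map preserves the quotient metric exactly
        # (bilipschitz with both constants equal to 1).
        assert abs(lhs - rhs) < 1e-9
    print("sort map is an isometry on R^4 / S_4 (100 random checks)")
```

For richer group actions (e.g., orthogonal groups acting on point clouds), no such exact isometry is available, which is where the low-distortion embeddings of the talk come in.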