1-2-3 Seminar: Representation Stability for Machine Learning

Jack Kendrick, University of Washington
PDL C-401

Symmetry arises often when learning from high-dimensional data. For example, data sets consisting of point clouds, graphs, and unordered sets appear routinely in contemporary applications and exhibit rich underlying symmetries. Moreover, many functions that we hope to learn from these data sets are well-defined regardless of the ambient dimension of the data. In this talk, we will explore how the phenomenon of representation stability can be exploited to learn invariant functions that generalize well as we vary the underlying dimension of the data. In particular, we will discuss three examples of machine learning models: equivariant neural networks, invariant polynomial regression, and invariant kernels.
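As a minimal sketch of the idea (not code from the talk), a permutation-invariant feature map such as the power-sum polynomials gives a function of a point set that is unchanged under reordering and remains well-defined as the number of points varies; the function `power_sum_features` below is a hypothetical illustration:

```python
import numpy as np

def power_sum_features(x, max_degree=3):
    """Permutation-invariant features of a 1-D point set x:
    the power sums p_k = sum_i x_i**k for k = 1..max_degree.
    These are symmetric in the entries of x and are defined
    for any number of points, so the same map applies as the
    size of the input set varies."""
    x = np.asarray(x, dtype=float)
    return np.array([np.sum(x**k) for k in range(1, max_degree + 1)])

# Invariance check: permuting the points leaves the features unchanged.
x = np.array([2.0, -1.0, 3.0])
permuted = x[[2, 0, 1]]
print(power_sum_features(x))
print(power_sum_features(permuted))
```

Running this prints identical feature vectors for `x` and its permutation, illustrating invariance; the same map accepts inputs of any length, which is the sense in which such functions are stable across the underlying dimension.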

Zoom Link: https://washington.zoom.us/j/92849568892
