Abstract: Exploiting symmetries in the learning problem has been a guiding principle for designing efficient neural network architectures on new domains, like images, sets, graphs, and point clouds. While symmetries such as translation, rotation, and permutation require only a few bits of prior knowledge to specify, restricting to equivariant models vastly reduces the search space and improves generalization. Using tools from representation theory, we provide a general algorithm for computing the space of equivariant layers for matrix groups. In addition to recovering well-known solutions from prior work, we can construct equivariant networks for new groups and representations. Building on these results, we outline a path towards automating model construction more generally.
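The core computation the abstract alludes to, solving for all linear layers that commute with a group's representation, can be sketched numerically. The example below is an illustrative sketch (not the speakers' code, and the setup is an assumption): for the cyclic group C4 acting on R^4 by cyclic shifts, we vectorize the equivariance constraint and recover the space of equivariant maps as the null space of a constraint matrix.

```python
import numpy as np

# Illustrative sketch: find all linear maps W with P W P^{-1} = W, where P is
# the generator of C4 acting on R^4 by a one-step cyclic shift.
n = 4
P = np.roll(np.eye(n), 1, axis=0)  # permutation matrix: (P x)_i = x_{(i-1) mod n}

# Vectorize the constraint using vec(A W B) = (B^T kron A) vec(W).
# P W P^{-1} = W  =>  (P^{-T} kron P - I) vec(W) = 0; P is orthogonal, so P^{-T} = P.
C = np.kron(P, P) - np.eye(n * n)

# The null space of C is a basis for the equivariant maps (computed via SVD;
# singular values are sorted in descending order, so zero ones come last).
_, s, Vt = np.linalg.svd(C)
basis = Vt[s < 1e-10]

print(basis.shape[0])  # dimension of the space of C4-equivariant maps on R^4
```

For this example the recovered space is exactly the circulant matrices, so the basis has dimension 4; each basis element `W = basis[i].reshape(n, n)` satisfies `P @ W @ P.T ≈ W`. The same null-space recipe extends to other matrix groups by stacking one constraint block per generator (and, for continuous groups, per Lie algebra generator).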
TAG-DS is a hybrid seminar and will be available in person at the UW Mathematics Department as well as online on Zoom. You can find the link to the Zoom meeting here. If you would like to be added to our mailing list, you can do so by visiting this page.