Polynomial Decompositions in Machine Learning

Joe Kileel, Princeton University
Tuesday, October 22, 2019 - 2:30pm to 3:30pm
PDL C-36

Symmetric tensors are multi-dimensional arrays invariant to permutation of indices; equivalently, they correspond one-to-one with homogeneous multivariate polynomials. In a variety of machine learning tasks, it is popular to decompose a symmetric tensor as a sum of symmetric outer products of vectors.
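As a concrete illustration of this correspondence (a small sketch with hypothetical data, not taken from the talk): a rank-2 symmetric 3-tensor built from outer products v ⊗ v ⊗ v is invariant under all permutations of its indices, and contracting it three times with a vector x evaluates the associated homogeneous cubic polynomial.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: a rank-2 symmetric 3-tensor in R^{3x3x3},
# built as a sum of symmetric outer products v ⊗ v ⊗ v.
vs = [rng.standard_normal(3) for _ in range(2)]
T = sum(np.einsum('i,j,k->ijk', v, v, v) for v in vs)

# Invariance under every permutation of the three indices.
for perm in itertools.permutations(range(3)):
    assert np.allclose(T, np.transpose(T, perm))

# One-to-one correspondence with a homogeneous cubic: contracting
# T with (x, x, x) gives p(x) = sum_r <v_r, x>^3.
x = rng.standard_normal(3)
lhs = np.einsum('ijk,i,j,k->', T, x, x, x)
rhs = sum(np.dot(v, x) ** 3 for v in vs)
assert np.isclose(lhs, rhs)
```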

In this talk, we present a new algorithm for computing low-rank symmetric tensor decompositions. Numerical experiments demonstrate that the method outperforms or matches state-of-the-art methods, per standard performance metrics. We prove some supporting theoretical guarantees, through connections to algebraic geometry and dynamical systems.
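For orientation, low-rank symmetric decomposition can be posed as a least-squares problem over the component vectors. The sketch below fits a rank-2 symmetric 3-tensor by plain gradient descent on the Frobenius loss; it is a generic baseline for intuition only, not the algorithm presented in this talk, and the target tensor is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

def sym_outer3(w):
    # Symmetric outer product w ⊗ w ⊗ w of a single vector.
    return np.einsum('i,j,k->ijk', w, w, w)

# Synthetic target: a rank-2 symmetric 3-tensor (hypothetical data).
V = rng.standard_normal((2, 3))
T = sum(sym_outer3(v) for v in V)

def loss(W):
    # Frobenius loss ||sum_r w_r^{⊗3} - T||^2 and its residual.
    R = sum(sym_outer3(w) for w in W) - T
    return np.sum(R ** 2), R

# Gradient descent from a random initialization.
W = 0.5 * rng.standard_normal((2, 3))
step = 1e-3
loss0, _ = loss(W)
for _ in range(2000):
    _, R = loss(W)
    # Gradient w.r.t. each w_r; the three symmetric terms collapse
    # into one contraction because the residual R is symmetric.
    G = np.stack([6 * np.einsum('ijk,j,k->i', R, w, w) for w in W])
    W = W - step * G
final, _ = loss(W)
```

With a nonconvex objective like this, plain gradient descent can stall in poor local minima; the interest of specialized methods, such as the one in this talk, lies in better accuracy and guarantees than such a baseline.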

Time permitting, we extend the method to compute a certain generalization of symmetric tensor decompositions. This enables estimation of a union of linear subspaces from very noisy point samples.

Joint work with Joao Pereira (Duke).