Many problems in data science and machine learning suffer from what is known as the curse of dimensionality: as data becomes higher-dimensional, learning from it becomes increasingly difficult. One way of combating this curse is to assume that, even when data lives in a high-dimensional space, there is some low-dimensional structure that can be exploited to make learning easier. Previous work has assumed that in many high-dimensional problems the data actually lies on a low-dimensional submanifold. A complementary approach is to assume that these high-dimensional problems exhibit hidden algebraic structure that makes learning more tractable. In this talk, we will see two examples of problems where algebraic structure can be used to combat the curse of dimensionality: invariant polynomial regression and kernel density estimation on algebraic varieties.