Using the linear geometry of ReLU neural networks to detect out-of-distribution inputs

Grayson Jorgenson, Pacific Northwest National Lab
Thursday, October 28, 2021 - 1:00pm to 2:00pm
PDL C-401 and online

Abstract: Many modern neural networks rely solely on ReLU activation functions for nonlinearity, motivated by the computational simplicity of ReLUs and their role in helping prevent vanishing gradients. These networks are consequently piecewise linear functions, which opens the door to studying their behavior using combinatorial and geometric techniques with no direct analog for networks employing smooth nonlinearities such as sigmoid activations. In this talk, we will discuss metrics that take advantage of this linear geometry and can be used to classify in- and out-of-distribution inputs for ReLU networks.
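To make the "linear geometry" concrete: each input to a ReLU network lands in a linear region identified by its binary activation pattern (which ReLUs fire). Below is a minimal, hypothetical sketch of one such geometry-based metric, using the Hamming distance from an input's pattern to the nearest pattern seen on training data as an out-of-distribution score. The toy network, random weights, and all function names are illustrative assumptions, not the speaker's method.

import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer ReLU network with random weights (hypothetical stand-in).
W1, b1 = rng.normal(size=(16, 2)), rng.normal(size=16)
W2, b2 = rng.normal(size=(8, 16)), rng.normal(size=8)

def activation_pattern(x):
    """Binary vector recording which ReLUs fire; labels x's linear region."""
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0) + b2
    return np.concatenate([h1 > 0, h2 > 0]).astype(int)

# Record the activation patterns of in-distribution training points.
train = rng.normal(size=(500, 2))
train_patterns = np.array([activation_pattern(x) for x in train])

def ood_score(x):
    """Hamming distance to the nearest training pattern; larger = more OOD."""
    p = activation_pattern(x)
    return int(np.abs(train_patterns - p).sum(axis=1).min())

print(ood_score(rng.normal(size=2)))       # typically small: in-distribution
print(ood_score(np.array([25.0, -25.0])))  # typically larger: far from data

The design intuition is that nearby in-distribution inputs tend to share linear regions with training data, while inputs far from the data manifold fall into regions the network never visited during training.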

TAG-DS is a hybrid seminar and will be available in person at the UW Mathematics Department as well as online on Zoom. You can find the link to the Zoom meeting here. If you would like to be added to our mailing list, you can do so by visiting this page.