Relative Entropy with Connections to Optimal Transport

Garrett Mulcahy, University of Washington
PDL C-401

Motivated by the problem of entropy-regularized optimal transport, this expository talk will begin by introducing relative entropy (also known as Kullback-Leibler divergence or I-divergence), a notion of distance on the space of probability distributions. Although relative entropy does not define a metric on this space, in several respects it behaves like squared Euclidean distance; for example, it admits a notion of projection and even a Pythagorean theorem. We will follow Csiszar's 1975 paper "I-Divergence Geometry of Probability Distributions and Minimization Problems" for these statements and more. These projections lead to quantities of central interest in optimal transport, a connection we will emphasize. If time permits, we will conclude by introducing an algorithm for computing these quantities, the iterative proportional fitting procedure (IPFP).
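
For reference, a minimal statement of the definition the talk starts from, written here in LaTeX; the notation D(P || Q) and the discrete-case sum are standard choices, not taken from the abstract itself:

% Relative entropy (Kullback-Leibler divergence) of P with respect to Q,
% defined when P is absolutely continuous with respect to Q:
\[
  D(P \,\|\, Q) \;=\; \int \log\frac{dP}{dQ}\, dP
  \;=\; \sum_{x} p(x)\,\log\frac{p(x)}{q(x)} \quad \text{(discrete case)},
\]
% with the convention D(P || Q) = +\infty when P is not absolutely
% continuous with respect to Q.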
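For readers who want a concrete picture of the iterative proportional fitting procedure before the talk, here is a minimal numerical sketch of the alternating rescaling idea in Python. The function name ipfp, the regularization parameter eps, the iteration count, and the toy cost matrix are illustrative assumptions, not material from the talk.

import numpy as np

def ipfp(C, r, c, eps=0.1, n_iter=500):
    """Iterative proportional fitting for entropy-regularized optimal transport.

    C    : (m, n) cost matrix
    r, c : target marginal probability vectors of lengths m and n
    eps  : regularization strength (illustrative default)
    """
    K = np.exp(-C / eps)          # Gibbs kernel from the cost matrix
    u = np.ones(len(r))
    v = np.ones(len(c))
    for _ in range(n_iter):
        # alternately rescale rows and columns to match the marginals
        u = r / (K @ v)
        v = c / (K.T @ u)
    # coupling whose row/column sums approximate r and c
    return u[:, None] * K * v[None, :]

# Toy usage: uniform marginals and squared-distance cost on a small grid.
x = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - x[None, :]) ** 2
P = ipfp(C, np.full(5, 0.2), np.full(5, 0.2))

Each pass of the loop rescales the rows and then the columns of the current matrix so that they match the prescribed marginals, which is exactly the "proportional fitting" that gives the procedure its name.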