
Fast Convergence of Langevin Dynamics on Manifold: Geodesics meet Log-Sobolev

Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)


Authors

Xiao Wang, Qi Lei, Ioannis Panageas

Abstract

Sampling is a fundamental and arguably very important task with numerous applications in Machine Learning. One approach to sample from a high-dimensional distribution $e^{-f}$ for some function $f$ is the Langevin Algorithm (LA). Recently, there has been a lot of progress in showing fast convergence of LA even in cases where $f$ is non-convex, notably \cite{VW19}, \cite{MoritaRisteski}; the former focuses on functions $f$ defined on $\mathbb{R}^n$, and the latter on functions with symmetries (like matrix-completion-type objectives) that have manifold structure. Our work generalizes the results of \cite{VW19} to the setting where $f$ is defined on a manifold $M$ rather than $\mathbb{R}^n$. From a technical point of view, we show that the KL divergence decreases at a geometric rate whenever the distribution $e^{-f}$ satisfies a log-Sobolev inequality on $M$.
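To illustrate the Euclidean baseline that the paper generalizes, below is a minimal sketch of the (unadjusted) Langevin Algorithm on $\mathbb{R}^n$: iterate $x_{k+1} = x_k - \eta \nabla f(x_k) + \sqrt{2\eta}\,\xi_k$ with $\xi_k \sim N(0, I)$, so that the iterates approximately sample from the density proportional to $e^{-f}$. The names `grad_f`, `eta`, and `n_steps` are illustrative choices, not notation from the paper.

```python
import numpy as np

def langevin_samples(grad_f, x0, eta=0.01, n_steps=20000, rng=None):
    """Unadjusted Langevin Algorithm on R^n (illustrative sketch).

    Runs x_{k+1} = x_k - eta * grad_f(x_k) + sqrt(2*eta) * N(0, I)
    and returns the trajectory; after a burn-in, the iterates are
    approximate samples from the density proportional to exp(-f).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps,) + x.shape)
    for k in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - eta * grad_f(x) + np.sqrt(2.0 * eta) * noise
        samples[k] = x
    return samples

# Example: f(x) = ||x||^2 / 2, so e^{-f} is the standard Gaussian
# and grad_f(x) = x. After burn-in, samples should have mean ~0, variance ~1.
samples = langevin_samples(lambda x: x, x0=np.zeros(2))
```

The manifold setting of the paper replaces the Euclidean update with a step along the manifold's geometry (e.g. via the exponential map), but the discretized dynamics follow the same gradient-plus-noise pattern.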