– Europe/Lisbon
Room P3.10, Mathematics Building — Online
Fenchel-Young Variational Learning II
This lecture first provides an introduction to classical variational inference (VI), a key technique for approximating complex posterior distributions in Bayesian methods, typically by minimizing the Kullback-Leibler (KL) divergence between the approximation and the true posterior. We'll discuss its principles and common uses.
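As a minimal illustration (not part of the talk itself): classical VI scores a candidate approximation q by its KL divergence to a target p. For two univariate Gaussians this divergence has a well-known closed form, sketched below.

```python
import math

def kl_gaussians(mu_q, sigma_q, mu_p, sigma_p):
    """Closed-form KL(q || p) for univariate Gaussians
    q = N(mu_q, sigma_q^2) and p = N(mu_p, sigma_p^2).

    KL is zero iff q == p, and grows as q moves away from p --
    the quantity classical VI minimizes over a family of q's.
    """
    return (math.log(sigma_p / sigma_q)
            + (sigma_q**2 + (mu_q - mu_p)**2) / (2.0 * sigma_p**2)
            - 0.5)

# Identical distributions have zero divergence; shifting the mean
# of q by one standard deviation costs 0.5 nats.
print(kl_gaussians(0.0, 1.0, 0.0, 1.0))  # 0.0
print(kl_gaussians(1.0, 1.0, 0.0, 1.0))  # 0.5
```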
Building on this, the lecture introduces Fenchel-Young variational inference (FYVI), a novel generalization that enhances flexibility. FYVI replaces the KL divergence with broader Fenchel-Young (FY) regularizers, with a special focus on those derived from Tsallis entropies. This approach enables learning posterior distributions with significantly smaller, or sparser, support than the prior, offering advantages in model interpretability and performance.
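To give a flavor of how a Tsallis-entropy regularizer yields sparse support (a sketch for intuition, not the talk's method): the Fenchel-Young predictive map induced by the Tsallis entropy with alpha = 2 is the sparsemax transformation of Martins and Astudillo (2016), which, unlike softmax, can assign exactly zero probability to low-scoring outcomes.

```python
import math

def softmax(scores):
    # Softmax (the KL / Shannon-entropy case): always dense,
    # every outcome gets strictly positive probability.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def sparsemax(scores):
    # Euclidean projection of the score vector onto the probability
    # simplex -- the Fenchel-Young map induced by the Tsallis entropy
    # with alpha = 2.  Low-scoring outcomes get exactly zero mass.
    z = sorted(scores, reverse=True)
    cumsum, tau = 0.0, 0.0
    for k, zk in enumerate(z, start=1):
        cumsum += zk
        if 1.0 + k * zk > cumsum:      # zk is still in the support
            tau = (cumsum - 1.0) / k   # threshold for the support set
    return [max(s - tau, 0.0) for s in scores]

scores = [2.0, 1.0, -1.0]
print(softmax(scores))    # dense: all three entries > 0
print(sparsemax(scores))  # sparse: [1.0, 0.0, 0.0]
```

Both outputs are valid probability distributions, but sparsemax concentrates all mass on the top-scoring outcome here, illustrating the "smaller support than the prior" behavior the abstract refers to.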
S. Sklavidis, S. Agrawal, A. Farinhas, A. Martins and M. Figueiredo, Fenchel-Young Variational Learning,
https://arxiv.org/pdf/2502.10295