Seminars

Europe/Lisbon
Room P3.10, Mathematics Building — Online

Mário Figueiredo, IT & Instituto Superior Técnico

Fenchel-Young Variational Learning I

This lecture first provides an introduction to classical variational inference (VI), a key technique for approximating complex posterior distributions in Bayesian methods, typically by minimizing the Kullback-Leibler (KL) divergence between a tractable approximating distribution and the true posterior. We will discuss its principles and common uses.
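
For concreteness, the standard formulation the lecture builds on can be sketched as follows (textbook notation, not necessarily the talk's: $q$ is the variational approximation, $p(z \mid x)$ the target posterior, and $\mathcal{Q}$ the variational family):

\[
q^{\star} \;=\; \operatorname*{arg\,min}_{q \in \mathcal{Q}} \, \mathrm{KL}\bigl(q(z) \,\|\, p(z \mid x)\bigr)
\;=\; \operatorname*{arg\,max}_{q \in \mathcal{Q}} \, \mathbb{E}_{q}\bigl[\log p(x, z)\bigr] + H(q),
\]

where $H(q)$ is the Shannon entropy of $q$; the maximized quantity is the evidence lower bound (ELBO), and the two problems are equivalent because the evidence $\log p(x)$ does not depend on $q$.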

Building on this, the lecture introduces Fenchel-Young variational inference (FYVI), a more flexible generalization that replaces the KL divergence with the broader family of Fenchel-Young (FY) regularizers, with particular focus on those induced by Tsallis entropies. This choice makes it possible to learn posterior distributions whose support is significantly smaller, i.e. sparser, than that of the prior, offering advantages in model interpretability and performance.
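
As a rough sketch of the machinery involved, using the standard Fenchel-Young loss and Tsallis negentropy definitions from the sparse-transformations literature (the exact variational objective in the referenced paper may differ in detail): a convex regularizer $\Omega$ induces the FY loss

\[
L_{\Omega}(\theta; q) \;=\; \Omega^{*}(\theta) + \Omega(q) - \langle \theta, q \rangle,
\qquad
\Omega^{*}(\theta) \;=\; \sup_{\mu} \, \bigl( \langle \theta, \mu \rangle - \Omega(\mu) \bigr),
\]

and taking $\Omega$ to be the negative Tsallis $\alpha$-entropy,

\[
\Omega_{\alpha}(q) \;=\; \frac{1}{\alpha(\alpha - 1)} \Bigl( \sum_{j} q_{j}^{\alpha} - 1 \Bigr), \qquad \alpha \neq 1,
\]

recovers the Shannon/KL case in the limit $\alpha \to 1$, while $\alpha > 1$ leads to entmax-style solutions whose support can be strictly smaller than the prior's.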

Reference: S. Sklavidis, S. Agrawal, A. Farinhas, A. Martins, and M. Figueiredo, Fenchel-Young Variational Learning, arXiv:2502.10295, https://arxiv.org/pdf/2502.10295