Seminars

Planned seminars

Europe/Lisbon
Room P3.10, Mathematics Building, Instituto Superior Técnico (https://tecnico.ulisboa.pt)

Mário Figueiredo, IT & Instituto Superior Técnico

This lecture first provides an introduction to classical variational inference (VI), a key technique for approximating complex posterior distributions in Bayesian methods, typically by minimizing the Kullback-Leibler (KL) divergence. We'll discuss its principles and common uses.
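As a minimal illustration of the KL-minimization idea (a sketch, not material from the lecture): when the target posterior is itself a one-dimensional Gaussian, the KL divergence to a Gaussian variational approximation is available in closed form, and gradient descent on the variational parameters recovers the target exactly. All names and numerical values below are illustrative.

```python
import math

# Toy setup: the "posterior" is a 1-D Gaussian N(mu, sigma^2), so
# KL(q || p) for q = N(m, s^2) has the closed form
#   KL = log(sigma / s) + (s^2 + (m - mu)^2) / (2 sigma^2) - 1/2
mu, sigma = 2.0, 0.5          # target posterior parameters
m, rho = 0.0, 0.0             # variational parameters; s = exp(rho) > 0
lr = 0.1

for _ in range(500):
    s = math.exp(rho)
    grad_m = (m - mu) / sigma**2           # dKL/dm
    grad_rho = (s / sigma) ** 2 - 1.0      # dKL/drho, via s = exp(rho)
    m -= lr * grad_m
    rho -= lr * grad_rho

s = math.exp(rho)
# Gradient descent drives q to the target: m -> mu and s -> sigma.
```

In realistic models the KL is intractable and one instead maximizes the evidence lower bound (ELBO), but the optimization structure is the same.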

Building on this, the lecture introduces Fenchel-Young variational inference (FYVI), a novel generalization that enhances flexibility. FYVI replaces the KL divergence with broader Fenchel-Young (FY) regularizers, with a special focus on those derived from Tsallis entropies. This approach enables learning posterior distributions with significantly smaller, or sparser, support than the prior, offering advantages in model interpretability and performance.
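To make the sparsity point concrete: in the discrete case, the Fenchel-Young map induced by the Tsallis entropy with α = 2 is sparsemax, the Euclidean projection onto the probability simplex, which (unlike softmax) can assign exactly zero probability to low-scoring outcomes. The sketch below implements sparsemax only; it is illustrative and not the lecture's FYVI algorithm.

```python
import numpy as np

def sparsemax(z):
    """Euclidean projection of a score vector z onto the probability simplex.

    This is the predictive map induced by the Tsallis alpha=2 entropy in the
    Fenchel-Young framework; it can return exact zeros, i.e. sparse support.
    """
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]            # scores in decreasing order
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, len(z) + 1)
    support = 1 + k * z_sorted > cumsum    # which entries stay positive
    k_star = k[support][-1]                # size of the support
    tau = (cumsum[k_star - 1] - 1) / k_star
    return np.maximum(z - tau, 0.0)
```

For well-separated scores such as `[3.0, 1.0, 0.2]`, sparsemax puts all mass on the first outcome, while softmax would spread positive probability over all three.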

Europe/Lisbon
Room P3.10, Mathematics Building, Instituto Superior Técnico (https://tecnico.ulisboa.pt)

Paulo Mourão, Sapienza University of Rome

The Hopfield Neural Network has played a fundamental role in the interdisciplinary study of the storage and retrieval capabilities of neural networks ever since its introduction by John Hopfield in 1982, a role further highlighted by the 2024 Nobel Prize in Physics.

From its strong link with biological pattern-retrieval mechanisms to its high-capacity Dense Associative Memory variants and its connections to generative models, the Hopfield Neural Network has found relevance both in neuroscience and in the most modern AI systems.

Much of our theoretical knowledge of these systems, however, comes from a surprising and powerful link with statistical mechanics, first established and explored in the seminal works of Amit, Gutfreund and Sompolinsky in the second half of the 1980s: the interpretation of associative memories as spin-glass systems.

In this talk, we will present this duality, as well as the mathematical techniques from spin-glass theory that allow us to accurately and rigorously predict the behavior of different types of associative memories capable of undertaking a variety of tasks.
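A minimal Hebbian Hopfield demo (a sketch; the network size, load, and noise level are arbitrary choices, not from the talk): store a few ±1 patterns with the outer-product rule, corrupt one of them, and let asynchronous zero-temperature dynamics relax the probe back to the stored pattern.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                              # 5 patterns in 100 neurons: load well below capacity
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian (outer-product) coupling matrix with zero self-coupling.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def retrieve(state, sweeps=10):
    """Asynchronous zero-temperature updates: each sweep updates all spins in order."""
    state = state.copy()
    for _ in range(sweeps):
        for i in range(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Corrupt 10% of the first stored pattern, then let the network relax.
probe = patterns[0].copy()
probe[rng.choice(N, size=10, replace=False)] *= -1
recovered = retrieve(probe)
overlap = recovered @ patterns[0] / N      # overlap of 1.0 means perfect retrieval
```

At this low load (P/N = 0.05, far below the ~0.138 capacity threshold of Amit, Gutfreund and Sompolinsky), the corrupted probe sits in the basin of attraction of the stored pattern and the dynamics recover it.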
