Planned seminars

Europe/Lisbon
Room P3.10, Mathematics Building, Instituto Superior Técnico (https://tecnico.ulisboa.pt)

André Martins, IT & Instituto Superior Técnico

$p$-adic machine learning I

Existing machine learning frameworks operate over the field of real numbers ($\mathbb{R}$) and learn representations in real (Euclidean or Hilbert) vector spaces (e.g., $\mathbb{R}^d$). Their underlying geometric properties align well with intuitive concepts such as linear separability, minimum enclosing balls, and subspace projection, and basic calculus provides a toolbox for learning through gradient-based optimization.

But is this the only possible choice? In this seminar, we study the suitability of a radically different field as an alternative to $\mathbb{R}$ — the ultrametric and non-archimedean space of $p$-adic numbers, $\mathbb{Q}_p$. The hierarchical structure of the $p$-adics and their interpretation as infinite strings make them an appealing tool for coding theory and hierarchical representation learning. Our exploratory theoretical work establishes the building blocks for classification, regression, and representation learning with the $p$-adics, providing learning models and algorithms. We illustrate how simple Quillian semantic networks can be represented as a compact $p$-adic linear network, a construction that is not possible over the field of real numbers. We finish by discussing open problems and opportunities for future research enabled by this new framework.

Based on:
André F. T. Martins, Learning with the $p$-adics
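To make the ultrametric structure mentioned in the abstract concrete, here are the standard definitions of the $p$-adic valuation and absolute value together with a small worked example (textbook material, not specific to the referenced paper):

$$
|x|_p = p^{-v_p(x)}, \qquad v_p(x) = \max\{k : p^k \text{ divides } x\}, \qquad |0|_p = 0 .
$$

For instance, $18 = 2 \cdot 3^2$, so $v_3(18) = 2$ and $|18|_3 = 3^{-2} = 1/9$, while $|5|_3 = 1$. The resulting metric satisfies the strong (ultrametric) triangle inequality

$$
|x + y|_p \le \max\bigl(|x|_p,\, |y|_p\bigr),
$$

which gives $\mathbb{Q}_p$ its hierarchical, tree-like geometry: every $p$-adic integer has a base-$p$ expansion $\sum_{k \ge 0} a_k p^k$ with digits $a_k \in \{0, \dots, p-1\}$, i.e., it can be read as an infinite string of digits.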

Europe/Lisbon
Room P3.10, Mathematics Building, Instituto Superior Técnico (https://tecnico.ulisboa.pt)

Paulo Mourão, Sapienza University of Rome

The Statistical Mechanics of Associative Memories I

Ever since its introduction by John Hopfield in 1982, the Hopfield Neural Network has played a fundamental role in the interdisciplinary study of the storage and retrieval capabilities of neural networks, a role further highlighted by the recent 2024 Physics Nobel Prize.
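For orientation, the classical Hopfield model stores $P$ binary patterns $\xi^\mu \in \{-1,+1\}^N$ through the Hebbian rule and retrieves them via a sign-threshold dynamics (the standard textbook formulation, stated here only as background):

$$
J_{ij} = \frac{1}{N} \sum_{\mu=1}^{P} \xi_i^{\mu} \xi_j^{\mu} \quad (J_{ii} = 0),
\qquad
\sigma_i(t+1) = \operatorname{sgn}\!\Bigl(\sum_{j} J_{ij}\, \sigma_j(t)\Bigr).
$$

Starting from a corrupted version of a stored pattern and, below the storage capacity, this dynamics converges to the nearest stored pattern, which is what is meant by pattern retrieval.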

From its strong link with biological pattern retrieval mechanisms to its high-capacity Dense Associative Memory variants and connections to generative models, the Hopfield Neural Network has found relevance both in neuroscience and in the most modern AI systems.

Much of our theoretical knowledge of these systems, however, comes from a surprising and powerful link with statistical mechanics, first established and explored in the seminal works of Amit, Gutfreund, and Sompolinsky in the second half of the 1980s: the interpretation of associative memories as spin-glass systems.
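Concretely, the link is that the Hopfield energy has the form of a spin-glass Hamiltonian with Hebbian couplings; this is the standard formulation analysed by Amit, Gutfreund, and Sompolinsky, recalled here as background rather than as a result of this talk:

$$
H(\sigma) = -\frac{1}{2} \sum_{i \neq j} J_{ij}\, \sigma_i \sigma_j
          = -\frac{1}{2N} \sum_{i \neq j} \sum_{\mu=1}^{P} \xi_i^{\mu} \xi_j^{\mu}\, \sigma_i \sigma_j ,
$$

so that stored patterns correspond to (near-)minima of $H$. Replica and related spin-glass techniques then yield quantitative predictions, such as the celebrated zero-temperature retrieval capacity $\alpha_c = P/N \approx 0.138$.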

In this talk, we will present this duality, as well as the mathematical techniques from spin-glass theory that allow us to accurately and rigorously predict the behavior of different types of associative memories capable of performing a variety of tasks.