A two-session mini course on 29/5/2026, 10:00-10:50 and 11:10-12:00, at P3.10@Técnico and online.

Diffusion probabilistic models have become a state-of-the-art tool in generative modeling, used to produce high-resolution samples from very high-dimensional distributions (e.g. images). Although very effective, they suffer from some drawbacks.
In this talk, we first provide an introduction to generative modeling, focusing on diffusion models from the point of view of stochastic PDEs. We then introduce a kernel-smoothed empirical score and study the bias and variance of this estimator. We obtain improved bounds on the KL divergence between the true measure and the approximate measure generated using the smoothed empirical score. This score estimator leads to less memorization and better generalization. We demonstrate these findings on synthetic and real datasets, combining diffusion models with variational autoencoders to reduce the dimensionality of the problem.
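To give a concrete flavor of the kind of estimator mentioned above: if the empirical measure of the data is smoothed with a Gaussian kernel of bandwidth sigma, the score of the smoothed measure has a closed form, namely a softmax-weighted average of directions toward the data points. The sketch below is an illustrative assumption about one simple such construction, not the specific estimator or bounds presented in the course.

```python
import numpy as np

def smoothed_empirical_score(x, data, sigma):
    """Score of the Gaussian-kernel-smoothed empirical measure at x.

    Computes grad_x log( (1/n) * sum_i N(x; x_i, sigma^2 I) ),
    where the rows of `data` are the sample points x_i.
    (Illustrative sketch; names and setup are assumptions.)
    """
    diffs = data - x                      # (n, d) directions x_i - x
    sq = np.sum(diffs ** 2, axis=1)       # squared distances ||x - x_i||^2
    logw = -sq / (2.0 * sigma ** 2)       # unnormalized Gaussian log-weights
    w = np.exp(logw - logw.max())         # stabilized softmax
    w /= w.sum()
    return (w[:, None] * diffs).sum(axis=0) / sigma ** 2

# With a single data point x_0, this reduces to the exact Gaussian
# score (x_0 - x) / sigma^2; with many points it interpolates between
# the nearest samples, which is what the kernel smoothing controls.
```

As the bandwidth sigma shrinks, the estimator concentrates on the training points (memorization); larger sigma trades variance for bias, which is the trade-off the talk analyzes.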
Permanent link to this information: https://m4ai.math.tecnico.ulisboa.pt/lecture_series?sgid=116