Lecture series

A two-session mini-course on 29/5/2026, 10:00-10:50 and 11:10-12:00, at P3.10@Técnico and online, with

Kernel-smoothed score for diffusion probabilistic models
Maria Han Veiga, Ohio State University


Diffusion probabilistic models have become the state-of-the-art tool in generative modeling, used to generate high-resolution samples from very high-dimensional distributions (e.g. images). Although very effective, they suffer from some drawbacks:

  1. as opposed to variational autoencoders, the dimensionality of the problem remains high throughout the generation process, and
  2. they can be prone to memorizing the training dataset.

In this talk, we first give an introduction to generative modeling, with a focus on diffusion models from the point of view of stochastic PDEs. We then introduce a kernel-smoothed empirical score and study the bias-variance tradeoff of this estimator. We obtain improved bounds on the KL divergence between the true measure and the approximate measure generated using the smoothed empirical score. This score estimator leads to less memorization and better generalization. We demonstrate these findings on synthetic and real datasets, combining diffusion models with variational autoencoders to reduce the dimensionality of the problem.
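To make the idea concrete, the sketch below computes the score of an empirical data measure convolved with a Gaussian of noise level `sigma_t`, with an extra smoothing bandwidth `h`. With `h = 0` this is the plain empirical score, which drives reverse diffusion samples back onto the training points (memorization); `h > 0` widens the kernel. This is a minimal illustration under assumed notation, not the talk's exact estimator or its analysis.

```python
import numpy as np

def smoothed_empirical_score(x, data, sigma_t, h=0.0):
    """Score (gradient of log density) at point x of the empirical measure
    over `data` (shape (n, d)) convolved with an isotropic Gaussian of
    variance sigma_t**2 + h**2.  `h` is an illustrative kernel-smoothing
    bandwidth, an assumption of this sketch."""
    s2 = sigma_t**2 + h**2
    diffs = data - x                               # (n, d) vectors x_i - x
    logw = -np.sum(diffs**2, axis=1) / (2.0 * s2)  # log Gaussian weights
    w = np.exp(logw - logw.max())                  # stabilized softmax
    w /= w.sum()
    # Score of a Gaussian mixture: weighted average of (x_i - x) / s2
    return (w[:, None] * diffs).sum(axis=0) / s2
```

With a single data point the formula reduces to `(x_0 - x) / (sigma_t**2 + h**2)`, so increasing `h` visibly damps the pull toward training points.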

Permanent link to this information: https://m4ai.math.tecnico.ulisboa.pt/lecture_series?sgid=116