Europe/Lisbon
Room P3.10, Mathematics Building
Online
Kernel-smoothed score for diffusion probabilistic models I
Diffusion probabilistic models have become a state-of-the-art tool in generative modeling, used to generate high-resolution samples from very high-dimensional distributions (e.g. images). Although very effective, they suffer from some drawbacks:
- unlike variational autoencoders, the dimension of the problem remains high throughout the generation process, and
- they can be prone to memorization of the training dataset.
In this talk, we first provide an introduction to generative modeling, with a focus on diffusion models from the point of view of stochastic PDEs. We then introduce a kernel-smoothed empirical score and study the bias-variance trade-off of this estimator. We obtain improved bounds on the KL divergence between the true measure and the approximate measure generated using the smoothed empirical score. This score estimator leads to less memorization and better generalization. We demonstrate these findings on synthetic and real datasets, combining diffusion models with variational autoencoders to reduce the dimensionality of the problem.
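To make the object concrete, here is a minimal sketch of what a kernel-smoothed empirical score can look like; the notation (scaling \(\alpha_t\), noise level \(\sigma_t\), bandwidth \(h\), Gaussian kernel) is our assumption and not necessarily the construction used in the talk. For a forward process \(X_t = \alpha_t X_0 + \sigma_t Z\) and data points \(x^1, \dots, x^n\), the empirical score is the score of a Gaussian mixture centered at the scaled data, and smoothing the empirical measure with a Gaussian kernel of bandwidth \(h\) simply inflates the mixture variance:
\[
\nabla \log p_t^{(n)}(x) = \nabla_x \log \frac{1}{n}\sum_{i=1}^{n} \mathcal{N}\big(x;\ \alpha_t x^i,\ \sigma_t^2 I\big),
\qquad
\nabla \log p_t^{(n,h)}(x) = \nabla_x \log \frac{1}{n}\sum_{i=1}^{n} \mathcal{N}\big(x;\ \alpha_t x^i,\ (\sigma_t^2 + \alpha_t^2 h^2)\, I\big).
\]
Taking \(h = 0\) recovers the plain empirical score, whose reverse dynamics reproduce the training points exactly; a positive bandwidth trades a controlled bias for lower variance, which is consistent with the bias-variance trade-off and reduced memorization described in the abstract.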