Statistical Physics of Generative Diffusion

Generative models, in which an algorithm is trained to generate samples ‘similar’ to those of a database, are a major new direction developed in machine learning in recent years. In particular, generative models based on diffusion equations have become the state of the art, notably for image generation. However, the reasons for this spectacular technological success are not well understood, and neither are its limitations. After an introduction to the topic, the talk will focus on the behavior of generative diffusion in the high-dimensional limit, where the data consist of a very large number of variables. Using methods from statistical physics, we explain the various dynamical regimes that occur during generation.
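As background for the abstract, the sketch below illustrates what "generative models based on diffusion equations" means in the simplest setting: a one-dimensional toy distribution (a symmetric mixture of two Gaussians) is noised by a forward Ornstein-Uhlenbeck process, and new samples are generated by integrating the reverse-time stochastic differential equation driven by the score of the noised distribution. This is not the method presented in the talk; all parameter names and values (`MU`, `SIGMA`, `T`, `N_STEPS`) are illustrative assumptions, and the score is computed exactly because it is tractable for this toy case.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data distribution: symmetric mixture of two Gaussians at +/- MU (illustrative values)
MU, SIGMA = 2.0, 0.3        # cluster positions and width
T, N_STEPS = 4.0, 400       # total diffusion time and number of reverse steps
DT = T / N_STEPS

def marginal_params(t):
    """Mean shift and variance of the forward (Ornstein-Uhlenbeck) process at time t."""
    m = MU * np.exp(-t)
    v = SIGMA**2 * np.exp(-2 * t) + 1.0 - np.exp(-2 * t)
    return m, v

def score(x, t):
    """Exact score grad_x log p_t(x) of the noised mixture (tractable in this toy case)."""
    m, v = marginal_params(t)
    return -x / v + (m / v) * np.tanh(m * x / v)

def sample(n=5000):
    """Generate samples by integrating the reverse-time SDE with Euler-Maruyama."""
    x = rng.normal(size=n)              # start from the stationary Gaussian prior
    for k in range(N_STEPS, 0, -1):
        t = k * DT
        drift = x + 2.0 * score(x, t)   # reverse drift: -f(x) + g^2 * score, with f = -x, g^2 = 2
        x = x + drift * DT + np.sqrt(2.0 * DT) * rng.normal(size=n)
    return x

if __name__ == "__main__":
    samples = sample()
    # Generated samples should concentrate near the two modes at +/- MU
    print("fraction near +MU:", np.mean(samples > 0))
    print("mode estimates:", samples[samples > 0].mean(), samples[samples < 0].mean())
```

In practice the score is not known and is learned by a neural network trained on the data; the high-dimensional dynamical regimes discussed in the talk concern how and when this reverse process commits to a particular mode or sample.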

Friday, 2nd February 2024, 14:30, Aula Magna