Vision · June 19, 2020 · UC Berkeley

Denoising Diffusion Probabilistic Models

Jonathan Ho, Ajay Jain, Pieter Abbeel

Abstract

We present high quality image synthesis results using diffusion probabilistic models, a class of latent variable models inspired by considerations from nonequilibrium thermodynamics. Our models produce samples that are competitive with state-of-the-art GANs while enjoying desirable properties such as distribution coverage and a stationary training objective.
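The diffusion process underlying these models gradually corrupts data with Gaussian noise over many steps; the model then learns to reverse that corruption. As a minimal sketch (assuming the paper's linear noise schedule of β from 1e-4 to 0.02 over T = 1000 steps), the forward process admits a closed-form sample at any timestep:

```python
import numpy as np

# Linear beta schedule (assumed values: 1e-4 to 0.02 over T = 1000 steps,
# as used in the DDPM paper).
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)  # cumulative product: ᾱ_t

def q_sample(x0, t, eps):
    """Closed-form sample from q(x_t | x_0):
    x_t = sqrt(ᾱ_t) * x_0 + sqrt(1 - ᾱ_t) * ε, with ε ~ N(0, I)."""
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps

rng = np.random.default_rng(0)
x0 = rng.standard_normal((4, 4))        # stand-in for an image
eps = rng.standard_normal(x0.shape)
x_noisy = q_sample(x0, t=T - 1, eps=eps)  # final step: nearly pure noise
```

At the last timestep ᾱ_t is close to zero, so x_t is almost indistinguishable from Gaussian noise, which is what makes sampling from noise at generation time possible.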

Key Findings

  1. Demonstrated that diffusion models can generate high-quality images competitive with GANs
  2. Introduced a simplified training objective based on denoising score matching
  3. Showed better mode coverage and training stability compared to GANs
  4. Established the theoretical framework for modern diffusion-based generation
  5. Achieved state-of-the-art FID scores on image generation benchmarks
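The simplified objective mentioned above trains a network ε_θ(x_t, t) to predict the noise that was added, minimizing L_simple = E[‖ε − ε_θ(x_t, t)‖²]. A minimal sketch of one Monte Carlo sample of this loss, with a placeholder in place of the real U-Net (the schedule values and the `eps_model` stub are assumptions, not the paper's implementation):

```python
import numpy as np

# Assumed linear beta schedule, as in the DDPM paper.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alpha_bars = np.cumprod(1.0 - betas)

rng = np.random.default_rng(1)

def eps_model(x_t, t):
    # Placeholder for the learned noise-prediction network eps_theta(x_t, t);
    # in the paper this is a U-Net conditioned on the timestep embedding.
    return np.zeros_like(x_t)

def simple_loss(x0):
    """One Monte Carlo sample of L_simple = E ||eps - eps_theta(x_t, t)||^2."""
    t = rng.integers(T)                       # uniform random timestep
    eps = rng.standard_normal(x0.shape)       # the noise actually added
    x_t = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return np.mean((eps - eps_model(x_t, t)) ** 2)

x0 = rng.standard_normal((4, 4))  # stand-in for a training image
loss = simple_loss(x0)
```

Because the target is the noise itself rather than a likelihood bound, this objective is a stationary regression target, which is one source of the training stability noted above.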

Impact & Significance

DDPM revived interest in diffusion models and directly led to the development of Stable Diffusion, DALL-E 2, Imagen, and the entire class of modern AI image generators. It replaced GANs as the dominant paradigm for image synthesis.
