Convergence of Deterministic and Stochastic Diffusion-Model Samplers: A Simple Analysis in Wasserstein Distance

arXiv — stat.ML · Friday, November 14, 2025 at 5:00:00 AM
The paper presents new convergence guarantees in Wasserstein distance for diffusion-based generative models, covering both stochastic (DDPM-like) and deterministic (DDIM-like) samplers. A simple framework is introduced that separates the errors arising from discretization, initialization, and score estimation. The authors derive the first Wasserstein convergence bound for the Heun sampler and improve existing bounds for the Euler sampler applied to the probability flow ODE. The analysis highlights the role of the spatial regularity of the learned score function and advocates controlling the score error along the true reverse process, in line with denoising score matching. Recent results on smoothed Wasserstein distances are also incorporated to sharpen the initialization error bounds.
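To make the Euler-versus-Heun distinction concrete, the following is a minimal sketch (not the paper's construction) of both deterministic samplers applied to the probability flow ODE of a VP diffusion. All specifics are illustrative assumptions: the data distribution is a toy Gaussian N(0, 0.25) so the true score is available in closed form, the noise schedule is a constant `BETA`, and initialization is drawn exactly from the time-1 marginal, so only the discretization error remains.

```python
import numpy as np

# Toy setup (assumed for illustration): data x_0 ~ N(0, DATA_VAR) under a VP
# forward process with constant beta. The marginal at time t is then
# N(0, sigma_t^2) with an analytic score, so both samplers can run exactly.
BETA = 2.0        # constant noise schedule (illustrative choice)
DATA_VAR = 0.25   # variance of the toy data distribution

def alpha(t):     # signal scale: exp(-0.5 * integral_0^t beta ds)
    return np.exp(-0.5 * BETA * t)

def sigma2(t):    # marginal variance of x_t when x_0 ~ N(0, DATA_VAR)
    a2 = alpha(t) ** 2
    return a2 * DATA_VAR + (1.0 - a2)

def score(x, t):  # score of N(0, sigma_t^2): grad log p_t(x) = -x / sigma_t^2
    return -x / sigma2(t)

def drift(x, t):  # probability flow ODE: dx/dt = -0.5*beta*(x + score(x, t))
    return -0.5 * BETA * (x + score(x, t))

def euler_sampler(x, n_steps=50, t_end=1e-3):
    """First-order Euler discretization, integrating from t=1 down to t_end."""
    ts = np.linspace(1.0, t_end, n_steps + 1)
    for t0, t1 in zip(ts[:-1], ts[1:]):
        x = x + (t1 - t0) * drift(x, t0)
    return x

def heun_sampler(x, n_steps=50, t_end=1e-3):
    """Second-order Heun scheme: Euler predictor plus trapezoidal corrector."""
    ts = np.linspace(1.0, t_end, n_steps + 1)
    for t0, t1 in zip(ts[:-1], ts[1:]):
        h = t1 - t0
        d0 = drift(x, t0)
        x_pred = x + h * d0                          # Euler predictor
        x = x + 0.5 * h * (d0 + drift(x_pred, t1))   # trapezoidal corrector
    return x

rng = np.random.default_rng(0)
# Initialize exactly from the t=1 marginal: no initialization error, so any
# deviation of the output variance from DATA_VAR is discretization error.
x_init = rng.standard_normal(100_000) * np.sqrt(sigma2(1.0))
for name, sampler in [("Euler", euler_sampler), ("Heun", heun_sampler)]:
    out = sampler(x_init.copy())
    print(f"{name}: sample variance = {out.var():.4f} (target {DATA_VAR})")
```

Because the score here is linear in `x`, both samplers transport a Gaussian to a Gaussian, and the output variance directly measures how well each scheme tracks the continuous-time flow; Heun's extra drift evaluation per step buys a higher-order local truncation error, which is the regime the paper's Heun bound addresses.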
— via World Pulse Now AI Editorial System
