

Poster

Accelerating Parallel Sampling of Diffusion Models

Zhiwei Tang · Jiasheng Tang · Hao Luo · Fan Wang · Tsung-Hui Chang


Abstract: Diffusion models have emerged as state-of-the-art generative models for image generation. However, sampling from diffusion models is usually time-consuming due to the inherent autoregressive nature of their sampling process. In this work, we propose a novel approach that accelerates the sampling of diffusion models by parallelizing the autoregressive process. Specifically, we reformulate the sampling process as solving a system of triangular nonlinear equations through fixed-point iteration. With this innovative formulation, we explore several systematic techniques to further reduce the iteration steps required by the solving process. Applying these techniques, we introduce ParaTAA, a universal and training-free parallel sampling algorithm that can leverage extra computational and memory resources to increase the sampling speed. Our experiments demonstrate that ParaTAA can decrease the inference steps required by common sequential sampling algorithms such as DDIM and DDPM by a factor of 4$\sim$14 times. Notably, when applying ParaTAA with 100-step DDIM for Stable Diffusion, a widely-used text-to-image diffusion model, it can produce the same images as the sequential sampling in only 7 inference steps.
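The core reformulation described in the abstract — viewing the autoregressive sampling chain as a triangular nonlinear system solved by fixed-point iteration — can be illustrated with a minimal sketch. The code below is a hypothetical toy (the function names and the simple contraction used as the per-step update are assumptions, not the paper's actual denoising rule); it shows why the triangular structure lets all equations be updated in parallel each sweep, with exactness guaranteed after at most T sweeps.

```python
import numpy as np

def parallel_fixed_point(f, T, tol=1e-8, max_iters=1000):
    """Solve the triangular system x[t] = f(t, x), where f(t, .) only
    reads x[:t], by fixed-point iteration: all T equations are updated
    in parallel each sweep, so one sweep costs one batched call to f
    instead of T sequential calls."""
    x = np.zeros(T)
    for k in range(1, max_iters + 1):
        # Parallel sweep: every unknown is refreshed from the current
        # estimates of its predecessors.
        x_new = np.array([f(t, x) for t in range(T)])
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, k
        x = x_new
    return x, max_iters

# Toy contraction standing in for one denoising update
# (hypothetical; not the paper's actual update rule).
def toy_step(t, x):
    prev = x[t - 1] if t > 0 else 0.0
    return 0.5 * prev + 1.0

x_final, sweeps = parallel_fixed_point(toy_step, T=10)
```

Because equation t depends only on equations 0..t-1, each sweep makes at least one more entry exact, so the iteration converges in at most T sweeps regardless of the starting point; in practice (and in the paper's experiments) convergence is much faster, which is where the speedup over sequential sampling comes from.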
