

Poster

Score identity Distillation: Exponentially Fast Distillation of Pretrained Diffusion Models for One-Step Generation

Mingyuan Zhou · Huangjie Zheng · Zhendong Wang · Mingzhang Yin · Hai Huang


Abstract:

We unveil Score identity Distillation (SiD), a novel approach that distills the generative capability of pretrained diffusion models into a single-step generator, achieving an exponentially fast reduction in Fréchet inception distance (FID) during distillation. By reformulating forward diffusion processes as semi-implicit distributions, we leverage three score-related identities to construct an innovative loss mechanism. This mechanism enables rapid FID reduction without requiring real data or reverse-diffusion-based generation, all within a significantly reduced generation time. Evaluated across four benchmark datasets, SiD consistently achieves low FIDs that approach or even surpass those of the teacher diffusion models, attains high iteration efficiency during distillation, and outperforms rival one-step generation methods in generation quality, redefining the standards for efficiency and effectiveness in diffusion distillation.
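The abstract stays high level; as a toy illustration of the score-matching idea behind this family of distillation methods (not the SiD loss itself), the sketch below distills a one-dimensional Gaussian "teacher" into a one-step linear generator `g(z) = a*z + b` by descending the difference between the teacher's score and the generator's own ("fake") score, both of which are analytic for Gaussians. The parameter names and the simplified gradient are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Toy sketch (NOT the SiD loss): distill a Gaussian "teacher" N(mu, sigma^2)
# into a one-step generator g(z) = a*z + b by descending the score-difference
# gradient E[(s_fake(x) - s_teacher(x)) * dx/dtheta], with analytic scores.
rng = np.random.default_rng(0)
mu, sigma = 2.0, 0.5          # teacher distribution parameters
a, b = 1.0, 0.0               # generator parameters to be learned
lr, batch = 0.05, 256

for _ in range(2000):
    z = rng.standard_normal(batch)
    x = a * z + b                         # one-step generation, no reverse diffusion
    s_teacher = -(x - mu) / sigma**2      # score of the teacher N(mu, sigma^2)
    s_fake = -(x - b) / a**2              # score of the generator's own output law
    diff = s_fake - s_teacher             # distillation signal; zero at convergence
    b -= lr * diff.mean()                 # chain rule: dx/db = 1
    a -= lr * (diff * z).mean()           # chain rule: dx/da = z

print(round(a, 2), round(b, 2))  # generator now matches the teacher
```

Because the fake score is recomputed from the current generator parameters, the gradient vanishes pointwise once the generator's distribution equals the teacher's, which is what drives the fast convergence; the actual SiD loss uses learned score networks and the three identities in the paper instead of these closed-form Gaussian scores.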
