

Poster

Denoising Score Matching For All

Raghav Singhal · Mark Goldstein · Rajesh Ranganath


Abstract:

Diffusion-based generative models (DBGMs) learn to reverse a noise process that transports the data distribution to the prior distribution. The only noise processes that are tractable to work with are linear processes with Gaussian noise distributions. This limits the kinds of models that can be built to those that target Gaussian noise and, more generally, limits the kinds of problems that can be solved to those with conditionally linear score functions. In this work, we introduce a family of tractable denoising score matching objectives, called local-DSM, built using local increments of the noise process. The local-DSM objectives are amenable to Taylor expansions, enabling training of DBGMs with non-linear noise processes. Examples of the added flexibility include training with non-linear drifts in the noise process, as is common in applications to statistical physics, biology, and finance, and transporting data to more flexible noise distributions, such as mixtures of Gaussians, Logistic distributions, and normalizing flows, which act as richer priors for the generative process. To demonstrate these ideas, we use local-DSM to train generative models with non-Gaussian priors on both challenging low-dimensional distributions and an image dataset. Additionally, we use the local-DSM objective to learn the scores of non-linear processes studied in statistical physics.
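
As a rough illustration of the local-increment idea (a minimal sketch, not the authors' implementation; the drift, score network, and step size below are hypothetical stand-ins), a single small Euler-Maruyama step of a non-linear SDE dx = f(x, t) dt + g(t) dW is approximately Gaussian, so the conditional score of that local increment is available in closed form and a score network can be regressed onto it, just as in standard denoising score matching:

import torch

def drift(x, t):
    # Hypothetical non-linear drift (double-well potential), standing in for f(x, t).
    return x - x**3

def local_dsm_loss(score_net, x_t, t, dt, g):
    # One Euler-Maruyama step: q(x_{t+dt} | x_t) ~= N(x_t + f(x_t, t) dt, g^2 dt I).
    mean = x_t + drift(x_t, t) * dt
    std = g * dt ** 0.5
    x_next = mean + std * torch.randn_like(x_t)
    # Conditional score of the Gaussian local transition, known in closed form:
    # grad_x' log q(x' | x_t) = -(x' - mean) / std^2.
    target = -(x_next - mean) / std**2
    pred = score_net(x_next, t + dt)  # score_net is an assumed neural network
    return ((pred - target) ** 2).mean()

In the paper's objective the Gaussian form of the local increment comes from a Taylor expansion of the process, and the regression is accumulated over many small increments along the trajectory; the sketch above only shows why a single local step yields a tractable denoising target even when the drift is non-linear.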
