

Poster

Minimizing $f$-Divergences by Interpolating Velocity Fields

Song Liu · Jiahao Yu · Jack Simons · Mingxuan Yi · Mark Beaumont


Abstract: Many machine learning problems can be formulated as approximating a target distribution with a particle distribution by minimizing a statistical discrepancy. Wasserstein Gradient Flow can be employed to move particles along a path that minimizes the $f$-divergence between the \textit{target} and \textit{particle} distributions. Performing such movements requires computing the corresponding velocity fields, which involve a density ratio function between these two distributions. Previous works estimated the density ratio function first and then differentiated the estimated ratio; this approach can suffer from overfitting, leading to a less accurate estimate. Inspired by non-parametric curve fitting, we instead estimate these velocity fields directly using interpolation. We prove that our method is asymptotically consistent under mild conditions. We validate its effectiveness on novel applications in domain adaptation and missing data imputation.
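For context on the baseline pipeline the abstract contrasts against (estimate the density ratio first, then differentiate it), below is a minimal, hypothetical sketch of a KL-divergence Wasserstein gradient flow on particles in NumPy. The kernel density estimator, bandwidth, finite-difference gradient, and all function names here are illustrative assumptions; this is not the interpolation method proposed in the paper, which estimates the velocity field directly rather than differentiating a fitted ratio.

```python
import numpy as np

def kde_log_density(x, samples, bandwidth):
    # Gaussian kernel density estimate of the log density at points x, shape (n, d).
    diffs = x[:, None, :] - samples[None, :, :]            # (n, m, d)
    sq = np.sum(diffs ** 2, axis=-1)                       # (n, m)
    log_k = -sq / (2.0 * bandwidth ** 2)
    # Normalizing constants cancel in the gradient of the log ratio below.
    return np.log(np.mean(np.exp(log_k), axis=1) + 1e-12)

def velocity_field(x, target_samples, particle_samples, bandwidth, eps=1e-3):
    # Baseline pipeline: fit the log density ratio log p_target - log p_particle,
    # then differentiate it numerically to obtain the velocity field.
    d = x.shape[1]
    grad = np.zeros_like(x)

    def log_ratio(z):
        return (kde_log_density(z, target_samples, bandwidth)
                - kde_log_density(z, particle_samples, bandwidth))

    for j in range(d):
        e = np.zeros(d)
        e[j] = eps
        grad[:, j] = (log_ratio(x + e) - log_ratio(x - e)) / (2.0 * eps)
    return grad  # for the KL divergence, particles move along this gradient

rng = np.random.default_rng(0)
target = rng.normal(loc=2.0, scale=0.5, size=(500, 1))     # samples from the target
particles = rng.normal(loc=-2.0, scale=1.0, size=(200, 1)) # initial particle cloud

step, bandwidth = 0.1, 0.5
for _ in range(200):
    v = velocity_field(particles, target, particles, bandwidth)
    particles = particles + step * v                        # forward-Euler flow update

print("particle mean after flow:", particles.mean())        # should drift toward 2.0
```

Because the ratio estimate is fitted first and differentiated afterwards, errors from overfitting are amplified by the differentiation step; the paper's contribution is to estimate the velocity field itself via interpolation, avoiding that intermediate differentiation.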
