Rethinking Losses for Diffusion Bridge Samplers

  • #Optimization
  • #Machine Learning
  • #Diffusion Bridges
  • Diffusion bridges are deep-learning methods for sampling from unnormalized distributions.
  • Prior work found that the Log Variance (LV) loss outperforms the reverse Kullback-Leibler (rKL) loss when gradients are computed with the reparametrization trick (both losses are restated schematically after this list).
  • For diffusion bridges, or whenever the diffusion coefficients are learned, the LV loss no longer maintains its equivalence with the rKL loss.
  • Estimating the rKL loss with the log-derivative trick (rKL-LD) avoids these conceptual problems and outperforms the LV loss; a toy sketch contrasting the estimators follows below.
  • Experiments show that diffusion bridges trained with the rKL-LD loss achieve better performance.
  • The rKL-LD loss also requires less hyperparameter tuning and yields more stable training.
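To make the bullets concrete, here is a schematic restatement of the losses under assumed notation (a model density $p_\theta$, an unnormalized target $\rho$ with normalizer $Z$ and $\pi = \rho/Z$, and the log-ratio $w_\theta(x) = \log(p_\theta(x)/\rho(x))$; for bridges, $w_\theta$ is the log-Radon-Nikodym derivative between the forward and backward path measures). This is a sketch of the standard definitions, not the paper's exact formulation:

```latex
% Schematic definitions under the assumed notation above
% (not the paper's exact formulation).
\begin{align}
  \mathcal{L}_{\mathrm{rKL}}(\theta)
    &= \mathbb{E}_{x \sim p_\theta}\!\bigl[ w_\theta(x) \bigr]
     = \mathrm{KL}\bigl( p_\theta \,\|\, \pi \bigr) + \log Z, \\
  \mathcal{L}_{\mathrm{LV}}(\theta)
    &= \mathrm{Var}_{x \sim q}\!\bigl[ w_\theta(x) \bigr],
  \qquad q = p_{\bar{\theta}} \text{ (the model with stopped gradient)}, \\
  \nabla_\theta \mathcal{L}_{\mathrm{rKL\text{-}LD}}
    &= \mathbb{E}_{x \sim p_{\bar{\theta}}}\!\bigl[
         \bigl( w_\theta(x) - b \bigr)\, \nabla_\theta \log p_\theta(x)
       \bigr].
\end{align}
```

Here $b$ is a constant baseline (e.g. the batch mean of $w_\theta$), and the rKL-LD line uses the score-function identity $\mathbb{E}_{p_\theta}[\nabla_\theta \log p_\theta] = 0$. When $q = p_{\bar{\theta}}$ and only $\log p_\theta$ carries parameters, the LV gradient is $2\,\mathbb{E}[(w_\theta - \mathbb{E}[w_\theta])\,\nabla_\theta \log p_\theta]$, i.e. the rKL-LD gradient up to a constant factor; the bullets above state that this identification fails for diffusion bridges and learned diffusion coefficients.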
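Below is a minimal, self-contained PyTorch sketch contrasting the three resulting surrogate losses on a toy Gaussian sampler standing in for a full diffusion bridge; every name here (`GaussianSampler`, `log_target`, and so on) is illustrative, not from the paper or an existing codebase:

```python
# Toy contrast of rKL (reparametrized), rKL-LD, and LV surrogate losses.
# A learnable Gaussian stands in for a diffusion bridge sampler.
import torch

torch.manual_seed(0)

def log_target(x):
    # Unnormalized log-density of the target (standard Gaussian, log Z unknown).
    return -0.5 * (x ** 2).sum(dim=-1)

class GaussianSampler(torch.nn.Module):
    def __init__(self, dim=2):
        super().__init__()
        self.mean = torch.nn.Parameter(torch.ones(dim))
        self.log_std = torch.nn.Parameter(torch.zeros(dim))

    def log_prob(self, x):
        # Model log-density, up to an additive constant.
        z = (x - self.mean) / self.log_std.exp()
        return (-0.5 * z ** 2 - self.log_std).sum(dim=-1)

    def sample_reparam(self, n):
        # Reparametrized samples: gradients flow through x into mean/log_std.
        eps = torch.randn(n, self.mean.shape[0])
        return self.mean + self.log_std.exp() * eps

model = GaussianSampler()
x_rep = model.sample_reparam(512)   # pathwise samples
x_det = x_rep.detach()              # same samples, sampling path detached

# (1) rKL with the reparametrization trick: gradients flow through the samples.
loss_rkl_reparam = (model.log_prob(x_rep) - log_target(x_rep)).mean()

# (2) rKL with the log-derivative trick (rKL-LD): REINFORCE-style surrogate,
#     with the batch mean of w as a variance-reducing baseline.
w = (model.log_prob(x_det) - log_target(x_det)).detach()
loss_rkl_ld = ((w - w.mean()) * model.log_prob(x_det)).mean()

# (3) Log-variance (LV) loss: variance of the log-ratio over detached samples.
loss_lv = (model.log_prob(x_det) - log_target(x_det)).var()

# One optimization step with the rKL-LD surrogate, for illustration.
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
opt.zero_grad()
loss_rkl_ld.backward()
opt.step()
```

In this toy setting, autograd on `loss_lv` yields a gradient parallel to that of `loss_rkl_ld` (roughly a factor of two apart), which is the equivalence the bullets mention; per the summary, that alignment breaks for diffusion bridges and learned diffusion coefficients.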