## Frontmatter
| | |
| --- | --- |
| Authors | [[Luca Ambrogioni]], [[Umut Güçlü]], [[Yağmur Güçlütürk]], [[Max Hinne]], [[Marcel van Gerven]], [[Eric Maris]] |
| Date | 2018/12 |
| Source | [[Conference on Neural Information Processing Systems]] |
| URL | http://papers.nips.cc/paper/by-source-2018-1244 |
| Citation | Ambrogioni, L., Güçlü, U., Güçlütürk, Y., Hinne, M., van Gerven, M., & Maris, E. (2018). [[Wasserstein variational inference]]. In _Conference on Neural Information Processing Systems_. [URL](http://papers.nips.cc/paper/by-source-2018-1244). #Conference |
## Abstract
This paper introduces Wasserstein variational inference, a new form of approximate Bayesian inference based on optimal transport theory. Wasserstein variational inference uses a new family of divergences that includes both f-divergences and the Wasserstein distance as special cases. The gradients of the Wasserstein variational loss are obtained by backpropagating through the Sinkhorn iterations. This technique results in a very stable likelihood-free training method that can be used with implicit distributions and probabilistic programs. Using the Wasserstein variational inference framework, we introduce several new forms of autoencoders and test their robustness and performance against existing variational autoencoding techniques.
## PDF
![[Wasserstein variational inference.pdf]]