Wasserstein variational inference
Red Hook, NY : Curran
In: Bengio, S.; Wallach, H.M.; Larochelle, H. (eds.), Proceedings of the 32nd International Conference on Neural Information Processing Systems, pp. 2478-2487
NIPS'18: 32nd International Conference on Neural Information Processing Systems (Montreal, Canada, 3-8 December 2018)
Article in monograph or in proceedings
Subject: Action, intention, and motor control; Cognitive artificial intelligence; DI-BCB_DCC_Theme 2: Perception, Action and Control; DI-BCB_DCC_Theme 4: Brain Networks and Neuronal Communication
This paper introduces Wasserstein variational inference, a new form of approximate Bayesian inference based on optimal transport theory. Wasserstein variational inference uses a new family of divergences that includes both f-divergences and the Wasserstein distance as special cases. The gradients of the Wasserstein variational loss are obtained by backpropagating through the Sinkhorn iterations. This technique results in a very stable likelihood-free training method that can be used with implicit distributions and probabilistic programs. Using the Wasserstein variational inference framework, we introduce several new forms of autoencoders and test their robustness and performance against existing variational autoencoding techniques.