

Publications in Scientific Journals:

J. Backhoff, J. Fontbona, G. Rios, F. Tobar:
"Bayesian learning with Wasserstein barycenters";
arXiv.org, submitted May 2018, 32 pages.



English abstract:
We introduce a novel paradigm for Bayesian learning based on optimal transport theory. Namely, we propose to use the Wasserstein barycenter of the posterior law on models as a predictive posterior, thus introducing an alternative to classical choices like the maximum a posteriori estimator and the Bayesian model average. We exhibit conditions granting the existence and statistical consistency of this estimator, discuss some of its basic and specific properties, and provide insight into its theoretical advantages. Finally, we introduce a novel numerical method which is ideally suited for the computation of our estimator, and we explicitly discuss its implementation for specific families of models. This method can be seen as a stochastic gradient descent algorithm in the Wasserstein space, and is of independent interest and applicability for the computation of Wasserstein barycenters. We also provide an illustrative numerical example for experimental validation of the proposed method.
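The stochastic gradient descent idea mentioned in the abstract can be illustrated in one dimension, where the optimal transport map between two measures is quantile matching. The sketch below is an assumption-laden toy (not the paper's exact algorithm): it represents the candidate barycenter by samples, repeatedly draws one of several Gaussian target measures at random, and moves the samples a decreasing step toward their optimal-transport images. For equally weighted 1-D Gaussians N(m_i, s_i^2), the W2 barycenter is N(mean of m_i, (mean of s_i)^2), which gives a check on the result.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target measures: 1-D Gaussians N(m_i, s_i^2), equal weights.
# Their W2 barycenter is N(mean(m_i), mean(s_i)^2) = N(1/3, 1.0).
means = np.array([-2.0, 0.0, 3.0])
stds = np.array([0.5, 1.0, 1.5])

def ot_map_1d(x, y_sorted):
    """1-D optimal transport map from samples x to sorted samples y:
    send the k-th smallest point of x to the k-th smallest point of y."""
    ranks = np.argsort(np.argsort(x))
    return y_sorted[ranks]

n = 2000
x = rng.normal(0.0, 1.0, n)  # samples representing the current barycenter
for k in range(500):
    i = rng.integers(len(means))              # pick a random target measure
    y = np.sort(rng.normal(means[i], stds[i], n))
    gamma = 2.0 / (k + 2)                     # decreasing step size
    # Geodesic step in Wasserstein space toward the chosen target.
    x = (1 - gamma) * x + gamma * ot_map_1d(x, y)

print(x.mean(), x.std())  # should approach 1/3 and 1.0
```

The step `(1 - gamma) * x + gamma * T(x)` follows the Wasserstein geodesic between the current iterate and the sampled target, which is what makes this interpretable as gradient descent in the Wasserstein space rather than in a parameter space.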

Keywords:
Bayesian learning; non-parametric estimation; Wasserstein distance; Wasserstein barycenter; Fréchet means; consistency; gradient descent; stochastic gradient descent


Electronic version of the publication:
https://arxiv.org/abs/1805.10833


Created from the Publication Database of the Vienna University of Technology.