April 30, 2021
For probability measures on countable spaces we derive distributional limits for empirical entropic optimal transport quantities. More precisely, we show that the empirical optimal transport plan weakly converges to a centered Gaussian process and that the empirical entropic optimal transport value is asymptotically normal. The results are valid for a large class of cost functions and generalize distributional limits for empirical entropic optimal transport quantities on fini...
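[Editorial note] A minimal NumPy sketch, not taken from the paper, of the plug-in idea behind such limit theorems: compute the entropic optimal transport value by Sinkhorn iterations for a true source measure and for its empirical counterpart, and look at the difference between the two values. All names, the 5-point support, and the convention value = <C, pi> + eps * KL(pi | a x b) are illustrative assumptions.

import numpy as np

def entropic_ot_value(a, b, C, eps=0.1, n_iter=500):
    """Entropic OT value <C, pi> + eps * KL(pi | a x b), computed via Sinkhorn."""
    K = np.exp(-C / eps)                       # Gibbs kernel
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iter):                    # alternate marginal scalings
        u = a / (K @ v)
        v = b / (K.T @ u)
    pi = u[:, None] * K * v[None, :]           # entropic optimal plan
    mask = pi > 0                              # KL term only on the support
    kl = np.sum(pi[mask] * np.log(pi[mask] / (a[:, None] * b[None, :])[mask]))
    return np.sum(pi * C) + eps * kl

rng = np.random.default_rng(0)
a = rng.dirichlet(np.ones(5))                  # "true" source measure
b = rng.dirichlet(np.ones(5))                  # target measure
C = rng.random((5, 5))                         # cost matrix
samples = rng.choice(5, size=1000, p=a)
a_hat = np.bincount(samples, minlength=5) / 1000   # empirical source measure
print(entropic_ot_value(a_hat, b, C) - entropic_ot_value(a, b, C))
# the fluctuation of this difference (suitably rescaled) is what such CLTs describe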
November 14, 2022
This paper is concerned with an optimization problem that is constrained by the Kantorovich optimal transportation problem. This bilevel optimization problem can be reformulated as a mathematical problem with complementarity constraints in the space of regular Borel measures. Because of the non-smoothness induced by the complementarity relations, problems of this type are frequently regularized. Here we apply a quadratic regularization of the Kantorovich problem. As the title...
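[Editorial note] For orientation, one common form of a quadratically regularized Kantorovich problem (stated here as background; the paper's precise formulation in the space of regular Borel measures may differ) is
$$\min_{\pi \in \Pi(\mu,\nu)} \int c \, d\pi \;+\; \frac{\gamma}{2} \int \Big(\frac{d\pi}{dR}\Big)^{2} dR,$$
where $R$ is a fixed reference measure (for instance $\mu \otimes \nu$), $\gamma > 0$ is the regularization parameter, and $\Pi(\mu,\nu)$ denotes the couplings of $\mu$ and $\nu$. Letting $\gamma \to 0$ recovers the unregularized Kantorovich cost, which is the regime the title refers to.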
March 7, 2005
We establish some quantitative concentration estimates for the empirical measure of many independent variables, in transportation distances. As an application, we provide some error bounds for particle simulations in a model mean field problem. The tools include coupling arguments, as well as regularity and moments estimates for solutions of certain diffusive partial differential equations.
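[Editorial note] An illustration only, not the paper's general multidimensional setting: a Monte Carlo estimate of the expected 1-Wasserstein distance between the empirical measure of $n$ i.i.d. standard Gaussian samples and their law, using the one-dimensional identity $W_1(\mu_n,\mu) = \int |F_n(t) - F(t)|\, dt$ approximated on a grid. The grid, sample sizes, and repetition count are arbitrary choices.

import numpy as np
from scipy.stats import norm

def w1_empirical_vs_gaussian(n, grid, rng):
    x = np.sort(rng.standard_normal(n))
    emp_cdf = np.searchsorted(x, grid, side="right") / n      # empirical CDF
    gap = np.abs(emp_cdf - norm.cdf(grid))                    # |F_n - F|
    return np.sum(0.5 * (gap[1:] + gap[:-1]) * np.diff(grid)) # trapezoid rule

rng = np.random.default_rng(0)
grid = np.linspace(-6.0, 6.0, 2001)
for n in (10, 100, 1000, 10000):
    est = np.mean([w1_empirical_vs_gaussian(n, grid, rng) for _ in range(50)])
    print(n, est)   # in dimension one the decay is of order n^(-1/2)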
April 24, 2003
In the first part of the paper we briefly describe the classical problem, raised by Monge in 1781, of optimal transportation of mass. We also discuss Kantorovich's weak solution of the problem, which leads to general existence results, to a dual formulation, and to necessary and sufficient optimality conditions. In the second part we describe some recent progress on the problem of the existence of optimal transport maps. We show that in several cases optimal transport maps can...
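[Editorial note] The two formulations referred to in the abstract, written for measures $\mu$ on $X$, $\nu$ on $Y$ and a cost $c$, are
$$\text{(Monge)}\quad \inf\Big\{ \int_X c(x, T(x))\, d\mu(x) \;:\; T_{\#}\mu = \nu \Big\}, \qquad \text{(Kantorovich)}\quad \inf\Big\{ \int_{X\times Y} c \, d\pi \;:\; \pi \in \Pi(\mu,\nu) \Big\},$$
where $\Pi(\mu,\nu)$ is the set of couplings, i.e. probability measures on $X \times Y$ with marginals $\mu$ and $\nu$. Any transport map $T$ induces the coupling $(\mathrm{id}, T)_{\#}\mu$, so the Kantorovich value never exceeds the Monge value; the existence question in the second part is when the Kantorovich minimizer is in fact induced by a map.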
November 16, 2022
We propose and investigate several statistical models and corresponding sampling schemes for data analysis based on unbalanced optimal transport (UOT) between finitely supported measures. Specifically, we analyse Kantorovich-Rubinstein (KR) distances with penalty parameter $C>0$. The main result provides non-asymptotic bounds on the expected error for the empirical KR distance as well as for its barycenters. The impact of the penalty parameter $C$ is studied in detail. Our ap...
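[Editorial note] One formulation of a Kantorovich--Rubinstein type distance with penalty $C$ found in the unbalanced optimal transport literature (the exact normalization used in the paper may differ) is
$$\mathrm{KR}_C(\mu,\nu) \;=\; \inf\Big\{ \int c \, d\pi + \tfrac{C}{2}\big((\mu - \pi_1)(\mathcal{X}) + (\nu - \pi_2)(\mathcal{X})\big) \;:\; \pi \ge 0,\ \pi_1 \le \mu,\ \pi_2 \le \nu \Big\},$$
where $\pi_1, \pi_2$ are the marginals of $\pi$. The parameter $C$ sets the price of creating or destroying mass rather than transporting it, which is why its impact on the statistical behaviour of the empirical distance is studied separately.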
September 6, 2010
The dual attainment of the Monge--Kantorovich transport problem is analyzed in a general setting. The spaces $X, Y$ are assumed to be Polish and equipped with Borel probability measures $\mu$ and $\nu$. The transport cost function $c: X \times Y \to [0,\infty]$ is assumed to be Borel measurable. We show that a dual optimizer always exists, provided we interpret it as a projective limit of certain finitely additive measures. Our methods are functional analytic and rely on Fenchel's pe...
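[Editorial note] The dual problem whose attainment is at stake is the classical one,
$$\sup\Big\{ \int_X \varphi \, d\mu + \int_Y \psi \, d\nu \;:\; \varphi(x) + \psi(y) \le c(x,y) \text{ for all } (x,y) \in X \times Y \Big\};$$
for merely Borel measurable, possibly infinite costs this supremum need not be attained by integrable functions $\varphi, \psi$, which is why the dual optimizer is interpreted as a projective limit of finitely additive measures, as stated in the abstract.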
April 13, 2022
The central limit theorem is, together with the strong law of large numbers, one of the two fundamental limit theorems in probability theory. Benjamin Jourdain and Alvin Tse have extended the central limit theorem, well known for linear functionals, to non-linear functionals of the empirical measure of independent and identically distributed random vectors. The main tool permitting this extension is the linear functional derivative, one of the notions of derivation on the Wasse...
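[Editorial note] For reference, one standard definition of the linear functional derivative of a map $U$ on the space of probability measures (conventions vary slightly across the literature): $\frac{\delta U}{\delta m}$ is a function of $(m, x)$ such that
$$U(m') - U(m) \;=\; \int_0^1 \int_{\mathbb{R}^d} \frac{\delta U}{\delta m}\big((1-t)\,m + t\,m', x\big)\, d(m'-m)(x)\, dt \qquad \text{for all } m, m',$$
usually normalized by $\int \frac{\delta U}{\delta m}(m, x)\, dm(x) = 0$. For a linear functional $U(m) = \int f\, dm$ this derivative is just $f$, which is why the classical CLT for linear functionals appears as the special case.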
May 18, 2007
We consider some general facts concerning convergence $P_n - Q_n \to 0$ as $n \to \infty$, where $P_n$ and $Q_n$ are probability measures in a complete separable metric space. The main point is that the sequences $\{P_n\}$ and $\{Q_n\}$ are not assumed to be tight. We compare different possible definitions of the above convergence, and establish some general properties.
February 4, 2020
A common way to quantify the "distance" between measures is via their discrepancy, also known as maximum mean discrepancy (MMD). Discrepancies are related to Sinkhorn divergences $S_\varepsilon$ with appropriate cost functions as $\varepsilon \to \infty$. In the opposite direction, if $\varepsilon \to 0$, Sinkhorn divergences approach another important distance between measures, namely the Wasserstein distance or more generally optimal transport "distance". In this chapte...
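[Editorial note] For context, the Sinkhorn divergence is usually defined by debiasing the entropy-regularized transport cost $\mathrm{OT}_\varepsilon$,
$$S_\varepsilon(\mu,\nu) \;=\; \mathrm{OT}_\varepsilon(\mu,\nu) - \tfrac12\,\mathrm{OT}_\varepsilon(\mu,\mu) - \tfrac12\,\mathrm{OT}_\varepsilon(\nu,\nu),$$
so that $S_\varepsilon(\mu,\mu) = 0$. As $\varepsilon \to 0$ one recovers the optimal transport cost for the cost function $c$, while as $\varepsilon \to \infty$ (for suitable cost/kernel pairs, and up to a constant factor) $S_\varepsilon$ tends to the squared MMD associated with the kernel $-c$; this is the interpolation described in the abstract.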
November 11, 2010
The aim of this article is to show that the Monge-Kantorovich problem is the limit of a sequence of entropy minimization problems when a fluctuation parameter tends down to zero. We prove the convergence of the entropic values to the optimal transport cost as the fluctuations decrease to zero, and we also show that the limit points of the entropic minimizers are optimal transport plans. We investigate the dynamic versions of these problems by considering random paths and desc...
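[Editorial note] A standard way to write the entropy minimization problems in question (one of several equivalent conventions; the article works with a general reference measure) is
$$\inf_{\pi \in \Pi(\mu,\nu)} \int c \, d\pi \;+\; \varepsilon\, H(\pi \,|\, \mu \otimes \nu), \qquad H(\pi \,|\, R) = \int \log\frac{d\pi}{dR}\, d\pi,$$
with fluctuation parameter $\varepsilon > 0$. The abstract's statement is then that the optimal values converge to the Monge-Kantorovich cost as $\varepsilon \downarrow 0$ and that cluster points of the entropic minimizers are optimal transport plans; the dynamic versions replace plans by laws of random paths.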