Empirical Inference

Metrizing Weak Convergence with Maximum Mean Discrepancies

2023

Article

This paper characterizes the maximum mean discrepancies (MMD) that metrize the weak convergence of probability measures for a wide class of kernels. More precisely, we prove that, on a locally compact, non-compact, Hausdorff space, the MMD of a bounded continuous Borel measurable kernel k, whose RKHS-functions vanish at infinity (i.e., H_k ⊂ C_0), metrizes the weak convergence of probability measures if and only if k is continuous and integrally strictly positive definite (∫s.p.d.) over all signed, finite, regular Borel measures. We also correct a prior result of Simon-Gabriel and Schölkopf (JMLR 2018, Thm. 12) by showing that there exist both bounded continuous ∫s.p.d. kernels that do not metrize weak convergence and bounded continuous non-∫s.p.d. kernels that do metrize it.
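
For context, the standard definitions behind the terms above, written as a brief LaTeX sketch in common notation (these are the usual mean-embedding and energy formulations from the MMD literature, not quoted from this paper):

% MMD as the RKHS distance between the kernel mean embeddings of P and Q:
\[
  \mathrm{MMD}_k(P,Q) \;=\; \bigl\| \mu_P - \mu_Q \bigr\|_{\mathcal{H}_k},
  \qquad
  \mu_P \;:=\; \int k(x,\cdot)\,\mathrm{d}P(x) \;\in\; \mathcal{H}_k .
\]
% k is integrally strictly positive definite (\(\int\)s.p.d.) over a class
% of measures \(\mathcal{M}\) when its energy is strictly positive there:
\[
  \iint k(x,y)\,\mathrm{d}\mu(x)\,\mathrm{d}\mu(y) \;>\; 0
  \qquad \text{for all nonzero } \mu \in \mathcal{M} .
\]
% Since \(\mathrm{MMD}_k(P,Q)^2 = \iint k\,\mathrm{d}(P-Q)\,\mathrm{d}(P-Q)\),
% the \(\int\)s.p.d. property over differences of probability measures is
% exactly what makes the MMD a metric; the paper's question is when this
% metric additionally generates the topology of weak convergence.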

Author(s): Simon-Gabriel, C.-J. and Barp, A. and Schölkopf, B. and Mackey, L.
Journal: Journal of Machine Learning Research
Volume: 24
Year: 2023

Department(s): Empirical Inference
Research Project(s): Statistical Learning Theory
BibTeX Type: Article (article)
Paper Type: Journal

State: Published
URL: https://www.jmlr.org/papers/volume24/21-0599/21-0599.pdf

Links: arXiv

BibTeX

@article{SimBarSchMac21,
  title = {Metrizing Weak Convergence with Maximum Mean Discrepancies},
  author = {Simon-Gabriel, C.-J. and Barp, A. and Sch{\"o}lkopf, B. and Mackey, L.},
  journal = {Journal of Machine Learning Research},
  volume = {24},
  year = {2023},
  url = {https://www.jmlr.org/papers/volume24/21-0599/21-0599.pdf}
}