This repository contains code to implement the coin sampling algorithms described in Sharrock et al. (2023). The basic implementation of the algorithms (e.g., Coin SVGD) can be found in `main.py`.
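To give a flavour of the learning-rate-free idea, the sketch below implements a simplified Coin SVGD update in NumPy: the SVGD drift plays the role of the "coin outcome", and each particle's displacement from its initial position is set by a coin-betting (Krichevsky–Trofimov/COCOB-style) rule rather than a tuned step size. The function names (`svgd_drift`, `coin_svgd`), the fixed kernel bandwidth, and the wealth clipping are our own simplifications for illustration; the implementation in `main.py` may differ in details.

```python
import numpy as np

def svgd_drift(x, grad_logp, h):
    """SVGD drift at each particle: (1/N) sum_j [k(x_j, x_i) grad log p(x_j)
    + grad_{x_j} k(x_j, x_i)], with an RBF kernel of fixed bandwidth h."""
    diffs = x[:, None, :] - x[None, :, :]          # diffs[j, i] = x_j - x_i
    K = np.exp(-np.sum(diffs ** 2, axis=-1) / h)   # K[j, i] = k(x_j, x_i)
    grads = grad_logp(x)                           # (N, d)
    attract = K.T @ grads                          # kernel-weighted gradients
    repulse = -2.0 / h * np.einsum("ji,jid->id", K, diffs)
    return (attract + repulse) / x.shape[0]

def coin_svgd(x0, grad_logp, n_iter=200, h=1.0):
    """Learning-rate-free SVGD sketch: each particle's displacement from its
    starting point is determined by a coin-betting rule driven by the drift."""
    x = x0.copy()
    grad_sum = np.zeros_like(x)        # running sum of drifts c_s
    reward = np.zeros(x.shape[0])      # running sum of <c_s, x_s - x_0>
    L = np.full(x.shape[0], 1e-12)     # running max drift norm per particle
    for t in range(1, n_iter + 1):
        c = svgd_drift(x, grad_logp, h)
        L = np.maximum(L, np.linalg.norm(c, axis=1))
        reward += np.sum(c * (x - x0), axis=1)
        grad_sum += c
        wealth = np.maximum(L + reward, L)  # never bet less than initial wealth
        x = x0 + grad_sum * (wealth / L)[:, None] / t
    return x
```

For example, sampling from a Gaussian with mean `mu` and identity covariance amounts to `coin_svgd(x0, lambda z: -(z - mu))`; note that no step size is tuned anywhere.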
For examples of how to use the code for the various models considered in our paper, see the notebooks below.
| File | Example |
|---|---|
| `toy_svgd.ipynb` | Toy examples. |
| `bayes_ica.ipynb` | Bayesian independent component analysis. |
| `bayes_lr.ipynb` | Bayesian logistic regression. |
| `bayes_nn.ipynb` | Bayesian neural network. |
| `bayes_pmf.ipynb` | Bayesian probabilistic matrix factorisation. |
If you find the code in this repository useful for your own research, please consider citing our paper:
```bibtex
@InProceedings{Sharrock2023,
  title     = {Coin Sampling: Gradient-Based Bayesian Inference without Learning Rates},
  author    = {Sharrock, Louis and Nemeth, Christopher},
  booktitle = {Proceedings of The 40th International Conference on Machine Learning},
  year      = {2023},
  address   = {Honolulu, Hawaii},
}
```
Our implementations of Coin SVGD, Coin LAWGD, and Coin KSDD are based on existing implementations of SVGD, LAWGD, and KSDD. We gratefully acknowledge the authors of the following papers for their open source code:
- Q. Liu and D. Wang. Stein Variational Gradient Descent (SVGD): A General Purpose Bayesian Inference Algorithm. NeurIPS, 2016. [Paper] | [Code].
- S. Chewi, T. Le Gouic, C. Lu, T. Maunu, P. Rigollet. SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence. NeurIPS, 2020. [Paper] | [Code].
- A. Korba, P.-C. Aubin-Frankowski, S. Majewski, P. Ablin. Kernel Stein Discrepancy Descent. ICML, 2021. [Paper] | [Code].
We did not contribute any of the datasets used in our experiments. Please get in touch if there are any conflicts of interest or other issues with hosting these datasets here.
- The Covertype dataset used in `bayes_lr.ipynb` is from the LIBSVM Data Repository.
- The datasets used in `bayes_nn.ipynb` are from the UCI Machine Learning Repository.
- The MovieLens dataset used in `bayes_pmf.ipynb` is from GroupLens.