Pytorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more
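Among the methods listed there, MC Dropout is the simplest to sketch: dropout is left active at test time and several stochastic forward passes are averaged. The snippet below is a minimal illustration of that idea, not code taken from the repository; the helper name and toy network are made up for the example.

```python
import torch
import torch.nn as nn


def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 20):
    """Run several stochastic forward passes with dropout left on and
    return the predictive mean and a per-output standard deviation."""
    model.eval()
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()  # keep dropout stochastic; other layers stay in eval mode
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)


# Toy usage with a hypothetical regression net that contains dropout
net = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Dropout(p=0.2), nn.Linear(64, 1))
mean, std = mc_dropout_predict(net, torch.randn(32, 10))
```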
A primer on Bayesian Neural Networks. The aim of this reading list is to ease the entry of new researchers into the field of Bayesian Deep Learning by providing an overview of key papers. More details: "A Primer on Bayesian Neural Networks: Review and Debates"
An elegant adaptive importance sampling algorithm for simulations of multi-modal distributions (NeurIPS'20)
Samplers from the paper "Stochastic Gradient MCMC with Repulsive Forces"
SGLD and cSGLD as a PyTorch Optimizer
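An SGLD step is a half-step of stochastic gradient descent on the negative log posterior plus Gaussian noise with variance equal to the step size. As a rough sketch of what packaging this as a PyTorch optimizer can look like (this is not the linked repository's API; the class name and signature are illustrative):

```python
import torch
from torch.optim import Optimizer


class SGLD(Optimizer):
    """Minimal SGLD sketch: theta <- theta - (lr / 2) * grad + sqrt(lr) * N(0, I),
    where grad is the stochastic gradient of the negative log posterior."""

    def __init__(self, params, lr=1e-3):
        if lr <= 0.0:
            raise ValueError("lr must be positive")
        super().__init__(params, dict(lr=lr))

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            lr = group["lr"]
            for p in group["params"]:
                if p.grad is None:
                    continue
                # Half gradient step followed by injected Gaussian noise
                noise = torch.randn_like(p) * lr ** 0.5
                p.add_(p.grad, alpha=-0.5 * lr)
                p.add_(noise)
        return loss
```

It is used like any other optimizer (`opt = SGLD(model.parameters(), lr=1e-5)`, then `loss.backward(); opt.step()` per mini-batch), with post-burn-in iterates treated as approximate posterior samples; cSGLD variants additionally cycle the step size so the chain alternates between exploration and sampling phases.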
PyTorch wrapper for Deep Density Estimation Models