The major contributors to this repository are Guodong Zhang and Shengyang Sun. Note that this repo uses a modified version of the TensorFlow K-FAC implementation.
A newer repo with implementations of both noisy K-FAC and noisy EK-FAC has been released.
This repository contains the code to reproduce the classification results from the paper Noisy Natural Gradient as Variational Inference (Paper, Video). For the RL code, see VIME-NNG.
Noisy Natural Gradient: Variational inference can be instantiated as natural gradient descent with adaptive weight noise. Further approximating the full Fisher with K-FAC yields noisy K-FAC, a surprisingly simple variational training algorithm for Bayesian neural nets. Noisy K-FAC not only improves classification accuracy but also gives well-calibrated predictions.
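To give a feel for the "natural gradient with adaptive weight noise" view, here is a deliberately simplified sketch on a scalar toy problem: a diagonal (here, one-parameter) Gaussian posterior trained with a running empirical-Fisher estimate, not the K-FAC version used in this repo. All constants and variable names below are illustrative, not taken from the codebase.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D Bayesian linear regression: y = 2*x + eps, eps ~ N(0, 0.1^2).
N, noise_var = 200, 0.01
x = rng.normal(size=N)
y = 2.0 * x + np.sqrt(noise_var) * rng.normal(size=N)

# Variational posterior q(w) = N(mu, sigma^2), isotropic Gaussian prior
# with precision eta; lam is the KL weighting (lambda in the paper).
eta, lam = 1.0, 1.0
mu, f = 0.0, 1.0         # posterior mean and running Fisher estimate
alpha, beta = 0.5, 0.05  # step size and Fisher EMA rate

for _ in range(3000):
    # "Adaptive weight noise": the posterior variance is derived from the
    # current Fisher estimate, so sigma shrinks as the Fisher grows.
    sigma = 1.0 / np.sqrt((N / lam) * f + eta)
    w = mu + sigma * rng.normal()                # sample the weights
    g_i = (w * x - y) * x / noise_var            # per-example gradients
    f = (1 - beta) * f + beta * np.mean(g_i**2)  # empirical Fisher (EMA)
    # Fisher-preconditioned (natural-gradient-like) step on the mean,
    # with the Gaussian prior folded in as shrinkage.
    mu -= alpha * (np.mean(g_i) + (lam / N) * eta * mu) / (f + (lam / N) * eta)

print(mu, sigma)  # mean near the true weight, small posterior std
```

Because the same Fisher estimate both preconditions the mean update and sets the weight-noise scale, a single quantity drives optimization and uncertainty — the core observation behind noisy natural gradient.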
At the moment, the implementation of convolution with multiple weight samples (which is very useful for Bayesian neural nets) is messy and slow; we plan to implement a new TensorFlow op after NIPS.
To cite this work, please use
@article{zhang2017noisy,
  title={Noisy Natural Gradient as Variational Inference},
  author={Zhang, Guodong and Sun, Shengyang and Duvenaud, David and Grosse, Roger},
  journal={arXiv preprint arXiv:1712.02390},
  year={2017}
}
This project uses Python 3.5.2. Before running the code, you have to install the required dependencies. Then run, for example:
python main.py --config configs/kfac_plain.json
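Experiments are driven by JSON config files under configs/. The snippet below only indicates the general shape of such a file — the keys shown are purely illustrative, so consult the actual files in configs/ for the real schema.

```json
{
  "exp_name": "cifar10/noisy-kfac",
  "dataset": "cifar10",
  "batch_size": 128,
  "learning_rate": 0.001,
  "num_epochs": 200
}
```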
This implementation supports TensorBoard visualization. All you have to do is launch TensorBoard from your experiment directory, located under experiments/:
tensorboard --logdir=experiments/cifar10/noisy-kfac/summaries