Bayesian Nets
- Deep Prior
- Weight Uncertainty in Neural Networks, e.g. Bayes by Backprop
  - Follow-on paper: Bayesian Hypernetworks
    - Train a hypernetwork to output a distribution over the parameters of the desired (primary) network.
    - Once trained, feed in random noise to sample the primary network's parameters.
    - Much of the algorithm is concerned with scaling to networks with many parameters.
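The sampling step described above can be sketched in a few lines. This is a minimal illustration, not the paper's architecture: the "hypernetwork" here is a single untrained linear map, and all names, sizes, and the primary network's shape are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PRIMARY = 2 * 3 + 3  # weights + biases of a hypothetical 2->3 linear layer
N_NOISE = 4

# Stand-in hypernetwork: one linear map from noise to primary parameters.
# In the paper this would be a trained invertible network, not a random matrix.
H = rng.normal(size=(N_PRIMARY, N_NOISE))

def sample_primary_params(z):
    """Map one noise vector to one sample of the primary net's parameters."""
    theta = H @ z
    W = theta[:6].reshape(3, 2)
    b = theta[6:]
    return W, b

def primary_forward(x, W, b):
    """The primary network: a single linear layer with a ReLU."""
    return np.maximum(0.0, W @ x + b)

# Two noise draws give two sampled networks; their disagreement on the same
# input is one source of predictive uncertainty.
x = np.array([1.0, -0.5])
y1 = primary_forward(x, *sample_primary_params(rng.normal(size=N_NOISE)))
y2 = primary_forward(x, *sample_primary_params(rng.normal(size=N_NOISE)))
```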
- Variational Dropout and the Local Reparameterization Trick
- Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
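The core trick in the dropout-as-Bayesian-approximation paper is cheap to demonstrate: keep dropout active at test time and average many stochastic forward passes. The network below is a toy two-layer net with made-up weights, purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(8, 2))   # illustrative fixed weights
W2 = rng.normal(size=(1, 8))
P_DROP = 0.5

def forward(x, rng):
    h = np.maximum(0.0, W1 @ x)
    mask = rng.random(h.shape) >= P_DROP  # dropout stays ON at test time
    h = h * mask / (1.0 - P_DROP)         # inverted-dropout scaling
    return (W2 @ h)[0]

# Monte Carlo estimate: the sample mean approximates the predictive mean,
# the sample spread approximates the model's uncertainty at this input.
x = np.array([0.3, -1.2])
samples = np.array([forward(x, rng) for _ in range(200)])
mean, std = samples.mean(), samples.std()
```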
- An Approximate Bayesian Long Short-Term Memory Algorithm for Outlier Detection
- Probabilistic supervised learning
- Using Deep Neural Network Approximate Bayesian Network
- Bayesian Neural Networks
- Efficient Exploration through Bayesian Deep Q-Networks
  - The last layer is a Bayesian linear regression model.
  - Performs better than Double DQN and is simpler to implement than other Bayesian exploration approaches.
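The last-layer idea can be sketched as follows: treat the network's penultimate features as fixed inputs and fit a conjugate Bayesian linear regression from features to Q-values, then Thompson-sample a weight vector for exploration. The data, prior, and noise variance below are my own illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
D = 4  # feature dimension of the (hypothetical) penultimate layer

# Synthetic "features" and targets standing in for phi(s) and observed returns.
Phi = rng.normal(size=(50, D))
w_true = np.array([1.0, -2.0, 0.5, 0.0])
y = Phi @ w_true + 0.1 * rng.normal(size=50)

# Conjugate update for w with prior N(0, tau2 I) and noise variance sigma2.
sigma2, tau2 = 0.01, 1.0
A = Phi.T @ Phi / sigma2 + np.eye(D) / tau2
cov = np.linalg.inv(A)
mean = cov @ Phi.T @ y / sigma2

# Thompson sampling: draw one plausible weight vector from the posterior and
# act greedily under it; posterior spread drives exploration.
w_sample = rng.multivariate_normal(mean, cov)
```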
- Bayesian Uncertainty Estimation for Batch Normalized Deep Networks
  - Similar to the ‘Dropout as a Bayesian approximation…’ paper, but uses Batch Norm layers to estimate uncertainty
- Variational Inference for Policy Gradient
- Modern Computational Methods for Bayesian Inference — A Reading List
