Efficient priors for scalable variational inference in Bayesian deep neural networks

R Krishnan, M Subedar… - Proceedings of the IEEE/CVF International Conference on…, 2019 - openaccess.thecvf.com
Abstract
Stochastic variational inference for Bayesian deep neural networks (DNNs) requires specifying priors and approximate posterior distributions for the neural network weights. Specifying meaningful weight priors is a challenging problem, particularly when scaling variational inference to deeper architectures with high-dimensional weight spaces. Based on the empirical Bayes approach, we propose the Bayesian MOdel Priors Extracted from Deterministic DNN (MOPED) method to choose meaningful prior distributions over the weight space, using deterministic weights derived from a pretrained DNN of equivalent architecture. We empirically evaluate the proposed approach on real-world applications, including image classification, video activity recognition, and audio classification tasks, with a variety of complex neural network architectures. The proposed method enables scalable variational inference with faster training convergence and provides reliable uncertainty quantification.
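To illustrate the idea sketched in the abstract, the following is a minimal NumPy sketch of an empirical-Bayes-style initialization: the pretrained deterministic weights serve as the mean of a Gaussian weight prior, and the variational posterior is initialized around those same weights. The function name moped_priors, the delta parameter, and the exact parameterization are illustrative assumptions, not the paper's API or its precise formulation.

import numpy as np

def moped_priors(pretrained_weights, delta=0.1):
    # Illustrative sketch (hypothetical helper, not the paper's code):
    # build Gaussian prior and posterior-initialization parameters
    # from pretrained deterministic weights.
    priors, posteriors = {}, {}
    for name, w in pretrained_weights.items():
        # Prior: Gaussian centered on the pretrained weights with unit scale.
        priors[name] = {"mean": w.copy(), "std": np.ones_like(w)}
        # Posterior init: mean at the pretrained weights, with a small
        # standard deviation proportional to each weight's magnitude.
        posteriors[name] = {"mean": w.copy(),
                            "std": delta * np.abs(w) + 1e-8}
    return priors, posteriors

# Usage: weights taken from a pretrained deterministic network (assumed given).
pretrained = {"fc1.weight": np.random.randn(128, 784).astype(np.float32)}
priors, posteriors = moped_priors(pretrained, delta=0.1)

Scaling the initial posterior standard deviation with each weight's magnitude is one plausible design choice for keeping the perturbation small relative to informative weights; the paper should be consulted for the exact scheme it evaluates.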