Hensman, J. and Lawrence, N.D. orcid.org/0000-0001-9258-1030 (Submitted: 2014) Nested Variational Compression in Deep Gaussian Processes. arXiv. (Unpublished)
Abstract
Deep Gaussian processes provide a flexible approach to probabilistic modelling of data using either supervised or unsupervised learning. For tractable inference, approximations to the marginal likelihood of the model must be made. The original approach to approximate inference in these models used variational compression to allow for approximate variational marginalization of the hidden variables, leading to a lower bound on the marginal likelihood of the model [Damianou and Lawrence, 2013]. In this paper we extend this idea with a nested variational compression. The resulting lower bound on the likelihood can be easily parallelized or adapted for stochastic variational inference.
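As a generic illustration of the kind of bound the abstract refers to (this is the standard variational lower bound obtained via Jensen's inequality, not the paper's specific nested bound), variationally marginalizing hidden variables $\mathbf{h}$ under an approximate posterior $q(\mathbf{h})$ gives:

```latex
\log p(\mathbf{y})
  = \log \int p(\mathbf{y}, \mathbf{h})\, \mathrm{d}\mathbf{h}
  = \log \int q(\mathbf{h}) \frac{p(\mathbf{y}, \mathbf{h})}{q(\mathbf{h})}\, \mathrm{d}\mathbf{h}
  \geq \int q(\mathbf{h}) \log \frac{p(\mathbf{y}, \mathbf{h})}{q(\mathbf{h})}\, \mathrm{d}\mathbf{h}
```

Because the right-hand side is a sum (integral) of per-data-point terms under standard likelihood factorizations, bounds of this form lend themselves to the parallel and stochastic optimization mentioned in the abstract.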
Metadata
| Item Type: | Article |
|---|---|
| Authors/Creators: | Hensman, J.; Lawrence, N.D. (orcid.org/0000-0001-9258-1030) |
| Copyright, Publisher and Additional Information: | © 2014 The Author(s). |
| Keywords: | stat.ML |
| Dates: | Submitted: 2014 |
| Institution: | The University of Sheffield |
| Academic Units: | The University of Sheffield > Faculty of Engineering (Sheffield) > Department of Computer Science (Sheffield) |
| Funding Information: | MEDICAL RESEARCH COUNCIL, grant MR/K022016/1; ENGINEERING AND PHYSICAL SCIENCE RESEARCH COUNCIL (EPSRC), grant EP/N014162/1 |
| Depositing User: | Symplectic Sheffield |
| Date Deposited: | 20 Jun 2017 08:37 |
| Last Modified: | 20 Jun 2017 08:37 |
| Published Version: | https://arxiv.org/abs/1412.1370 |
| Status: | Unpublished |
| Related URLs: | |
| Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:116495 |