Vincent Adam: Doubly Sparse Variational Gaussian Processes

Abstract

The use of Gaussian process models is typically limited to datasets with a few tens of thousands of observations because of their cubic computational cost and quadratic memory footprint. The two most common ways to overcome this limitation are 1) the sparse variational approximation, which relies on inducing points, and 2) the equivalent state-space formulation of Gaussian processes, which can be seen as exploiting sparsity in the precision matrix. We propose to take the best of both worlds: we show that the inducing-point framework remains valid for state-space models and that it can bring further computational and memory savings. This formulation allows for minibatching and can be applied to deep Gaussian process models.
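To make the second ingredient concrete, the sketch below illustrates the state-space view on a toy regression problem: a GP with an exponential (Matérn-1/2, Ornstein-Uhlenbeck) kernel is equivalent to a one-dimensional linear SDE, so a Kalman filter processes the data in time linear in the number of observations rather than cubic. This is a minimal NumPy illustration under assumed settings (the function name `ou_kalman_filter` and all hyperparameter values are invented for the example); it shows only the plain state-space formulation, not the doubly sparse method presented in the talk.

```python
import numpy as np

# Minimal sketch of the state-space view of a GP with the exponential
# (Ornstein-Uhlenbeck / Matern-1/2) kernel k(t, t') = var * exp(-|t - t'| / ell).
# The equivalent SDE has a one-dimensional state, so Kalman filtering costs O(N)
# instead of the O(N^3) of a dense GP solve. Names and values are illustrative.

def ou_kalman_filter(t, y, var=1.0, ell=0.5, noise_var=0.1):
    """Filtered means/variances for an OU-kernel GP regression.

    A backward (Rauch-Tung-Striebel) smoothing pass would complete the
    full posterior marginals; only the forward filter is shown here.
    """
    m, P = 0.0, var              # stationary prior at the first time point
    means, variances = [], []
    t_prev = t[0]
    for tk, yk in zip(t, y):
        # Predict: exact discretization of df = -(1/ell) f dt + white noise.
        A = np.exp(-(tk - t_prev) / ell)
        m, P = A * m, A**2 * P + var * (1.0 - A**2)
        # Update with the Gaussian observation y_k = f(t_k) + eps.
        S = P + noise_var        # innovation variance
        K = P / S                # Kalman gain
        m, P = m + K * (yk - m), (1.0 - K) * P
        means.append(m)
        variances.append(P)
        t_prev = tk
    return np.array(means), np.array(variances)

# Toy usage: N can be large since the cost is linear in N.
t = np.sort(np.random.rand(10_000)) * 10.0
y = np.sin(t) + 0.3 * np.random.randn(t.size)
mu, v = ou_kalman_filter(t, y)
```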

Date
Oct 31, 2019 10:00 AM — 11:00 AM
Event
SML Seminar
Location
AI Centre, UCL

Bio

Vincent Adam studied engineering and cognitive science in France before completing a PhD at the intersection of these fields at the Gatsby Unit, UCL. There, he studied perception, framing it as a probabilistic inference problem and using psychophysical methodology. He also developed the non-parametric regression methods he needed to analyze behavioral data. Since then, at Prowler.io, his research has focused on deriving scalable yet accurate approximate inference algorithms (mainly variational) for models involving Gaussian process priors.