Vincent Adam: Variational inference for Gaussian Process models: multi-GP regression and time-series models


Variational inference (VI) is a versatile framework for performing approximate inference in probabilistic models. Two recent developments have made it widely applicable to Gaussian Process (GP) models: the use of sparse GPs as approximate posterior processes, and the use of automatic differentiation. I will briefly review VI and sparse GPs, and then present my contributions to scaling VI for two classes of GP models: multi-GP models and time-series models. I will illustrate these contributions with examples from neuroscience.
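As background for the abstract above, here is a minimal NumPy sketch of the core idea behind sparse GP regression: a small set of inducing points summarises the full dataset, and the optimal Gaussian approximate posterior over the inducing values has a closed form for a Gaussian likelihood (the collapsed variational bound of Titsias, 2009). This is an illustrative sketch only, not the speaker's method; all function names and the toy data are assumptions made for the example.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between 1-D point sets a and b
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_posterior(X, y, Z, noise_std=0.1):
    """Optimal Gaussian q(u) over inducing values u = f(Z) for a sparse GP
    with Gaussian likelihood (collapsed bound): mean and covariance."""
    Kzz = rbf(Z, Z) + 1e-6 * np.eye(len(Z))  # jitter for stability
    Kzx = rbf(Z, X)
    # A = Kzz + sigma^{-2} Kzx Kxz
    A = Kzz + Kzx @ Kzx.T / noise_std**2
    m_u = Kzz @ np.linalg.solve(A, Kzx @ y) / noise_std**2
    S_u = Kzz @ np.linalg.solve(A, Kzz)
    return m_u, S_u

def predict_mean(Xs, Z, m_u):
    # Posterior predictive mean: K_{*z} Kzz^{-1} m_u
    Kzz = rbf(Z, Z) + 1e-6 * np.eye(len(Z))
    return rbf(Xs, Z) @ np.linalg.solve(Kzz, m_u)

# Toy 1-D regression: M=10 inducing points summarise N=200 observations
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 10, 200))
y = np.sin(X) + 0.1 * rng.normal(size=200)
Z = np.linspace(0, 10, 10)          # inducing-point locations
m_u, S_u = sparse_gp_posterior(X, y, Z)
mu = predict_mean(X, Z, m_u)
mse = np.mean((mu - np.sin(X)) ** 2)
print(mse)  # small: the 10 inducing points recover the underlying sine
```

The computational gain is that the expensive linear algebra involves only the M×M inducing-point matrices rather than the full N×N kernel matrix, which is what makes VI for GPs scale to large datasets.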

February 27, 2020 11:30 — 12:00
SML Seminar
AI Centre, UCL


Vincent Adam studied engineering and cognitive science in Paris before completing a PhD at the Gatsby Unit, UCL, at the intersection of machine learning and computational neuroscience. Having initially used probabilistic modelling as a normative framework for cognitive function (perception), he now uses the same framework to develop interpretable and scalable machine learning algorithms, with a focus on Gaussian Processes and time-series modelling.
Marc Deisenroth
DeepMind Chair in Artificial Intelligence