Tim G. J. Rudner: Inter-domain Deep Gaussian Processes

Abstract

Inter-domain Gaussian processes (GPs) allow for high flexibility and low computational cost when performing approximate inference in GP models. They are particularly suitable for modeling data exhibiting global structure but are limited to stationary covariance functions and thus fail to model non-stationary data effectively. We propose Inter-domain Deep Gaussian Processes, an extension of inter-domain shallow GPs that combines the advantages of inter-domain and deep Gaussian processes (DGPs), and demonstrate how to leverage existing approximate inference methods to perform simple and scalable approximate inference using inter-domain features in DGPs. We assess the performance of our method on a range of regression tasks and demonstrate that it outperforms inter-domain shallow GPs and conventional DGPs on challenging large-scale real-world datasets exhibiting both global structure and a high degree of non-stationarity.
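For readers unfamiliar with the inducing-point machinery that inter-domain and sparse GP methods build on, here is a minimal sketch of sparse GP regression with M inducing inputs (a plain-NumPy subset-of-regressors approximation; all function names and parameter values are illustrative assumptions, not the speaker's code):

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2, variance=1.0):
    # Squared-exponential (stationary) covariance between rows of A and B.
    sq_dists = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def sor_predict(X, y, Z, X_star, noise=0.01):
    # Subset-of-regressors predictive mean: the N training points are
    # summarized through M inducing inputs Z, so the matrix inverted is
    # M x M rather than N x N.
    Kzz = rbf_kernel(Z, Z)
    Kzx = rbf_kernel(Z, X)
    Ksz = rbf_kernel(X_star, Z)
    Sigma = np.linalg.inv(Kzz + Kzx @ Kzx.T / noise + 1e-8 * np.eye(len(Z)))
    return Ksz @ Sigma @ Kzx @ y / noise

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (200, 1))
y = np.sin(12 * X) + 0.1 * rng.standard_normal((200, 1))
Z = np.linspace(0, 1, 15).reshape(-1, 1)  # M = 15 inducing inputs, M << N
mu = sor_predict(X, y, Z, np.array([[0.5]]))
```

Inter-domain methods generalize this idea by defining the inducing variables as linear functionals (e.g. integral projections) of the GP rather than point evaluations, which is what enables the global, frequency-domain features the talk discusses.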

Date
August 13, 2020, 16:00–17:00
Event
SML Seminar
Location
online

Bio

Tim G. J. Rudner is a PhD Candidate in the Department of Computer Science at the University of Oxford, supervised by Yarin Gal and Yee Whye Teh. His research interests span Bayesian deep learning, reinforcement learning, and variational inference. He holds a master’s degree in statistics from the University of Oxford and an undergraduate degree in mathematics and economics from Yale University. Tim is also an AI Fellow at Georgetown University's Center for Security and Emerging Technology (CSET), a Fellow of the German National Academic Foundation, and a Rhodes Scholar.
Marc Deisenroth
DeepMind Chair in Artificial Intelligence