Research Blog

Matérn Gaussian Processes on Graphs

Gaussian processes are a model class for learning unknown functions from data. They are of particular interest in statistical decision-making systems due to their ability to quantify and propagate uncertainty. In this work, we study analogs of the popular Matérn class where the domain of the Gaussian process is replaced by a weighted undirected graph.
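
As a concrete illustration, the graph analog can be computed spectrally: apply the filter (2ν/κ² + λ)^(−ν) to the eigenvalues of the graph Laplacian. The sketch below assumes this form; the function and variable names, and the normalization, are ours rather than code from the post.

import numpy as np

def graph_matern_kernel(L, nu=3/2, kappa=1.0, sigma2=1.0):
    """Matérn-type kernel on a graph, built from its Laplacian L (n x n, symmetric).

    Sketch: apply the spectral filter (2*nu/kappa**2 + lam)**(-nu) to the
    Laplacian eigenvalues, then rescale to an average variance of sigma2.
    """
    lam, U = np.linalg.eigh(L)                   # eigendecomposition of the Laplacian
    psi = (2.0 * nu / kappa**2 + lam) ** (-nu)   # spectral filter per eigenvalue
    K = (U * psi) @ U.T                          # K = U diag(psi) U^T
    K *= sigma2 * K.shape[0] / np.trace(K)       # normalize the average variance
    return K

# Example: a path graph on five nodes with unit edge weights.
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)   # adjacency matrix
L = np.diag(A.sum(axis=1)) - A                          # combinatorial Laplacian
K = graph_matern_kernel(L)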

Probabilistic Active Meta Learning (PAML)

Meta-learning can make machine learning algorithms more data-efficient, using experience from prior tasks to learn related tasks more quickly. Since some tasks are more informative than others with respect to performance on any given task in a domain, an interesting question arises: how might a meta-learning algorithm automatically choose an informative task to learn? Here we summarise probabilistic active meta-learning (PAML), a meta-learning algorithm that uses latent task representations to rank and select informative tasks to learn next.
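
To make "rank and select" concrete, here is a deliberately simplified, hypothetical utility, not the criterion from the paper: score each candidate task by how far its latent embedding lies from the tasks already learned, and pick the most novel one. All names below are illustrative.

import numpy as np

def select_next_task(candidate_embeddings, learned_embeddings):
    """Toy ranking of candidate tasks in a latent task space.

    Scores each candidate by its distance to the nearest already-learned task
    embedding and returns the index of the most novel candidate. (A stand-in
    utility for illustration, not the selection criterion used by PAML.)
    """
    C = np.asarray(candidate_embeddings)   # (num_candidates, latent_dim)
    T = np.asarray(learned_embeddings)     # (num_learned, latent_dim)
    dists = np.linalg.norm(C[:, None, :] - T[None, :, :], axis=-1)
    novelty = dists.min(axis=1)            # distance to the nearest learned task
    return int(np.argmax(novelty))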

Variational Integrator Networks

Learning models of physical systems can be tricky, but exploiting inductive biases about the nature of the system can speed up learning significantly. In the following, we give a brief overview of variational integrator networks and the key insights behind them.
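
One such inductive bias is that trajectories should come from a structure-preserving (variational) integrator rather than a generic recurrent update. As a rough illustration only, assuming a separable mechanical system, a Störmer-Verlet (leapfrog) step looks as follows; in a variational integrator network the potential would be a learned function rather than fixed by hand.

import numpy as np

def leapfrog_step(q, p, grad_potential, h=0.1):
    """One Störmer-Verlet (leapfrog) step for a separable mechanical system."""
    p_half = p - 0.5 * h * grad_potential(q)             # half step in momentum
    q_next = q + h * p_half                              # full step in position
    p_next = p_half - 0.5 * h * grad_potential(q_next)   # half step in momentum
    return q_next, p_next

# Example: a pendulum with potential U(q) = -cos(q), so grad U(q) = sin(q).
grad_U = np.sin
q, p = np.array([1.0]), np.array([0.0])
for _ in range(10):
    q, p = leapfrog_step(q, p, grad_U)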

High-dimensional Bayesian Optimization Using Low-dimensional Feature Spaces

Bayesian optimization is a powerful technique for optimizing expensive black-box functions, but it is typically limited to low-dimensional problems. Here, we extend this setting to higher dimensions by learning a lower-dimensional embedding within which we optimize the black-box function.
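
The sketch below shows the overall loop under simplifying assumptions: a fixed random linear map stands in for the learned nonlinear embedding, and a basic lower-confidence-bound rule over random candidates stands in for a proper acquisition optimizer. It is illustrative only, not the post's implementation.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def black_box(x):
    """Toy stand-in for an expensive 20-dimensional objective."""
    return float(np.sum((x - 0.25) ** 2))

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 2))                 # stand-in for a learned embedding
decode = lambda z: np.clip(A @ z, -1.0, 1.0)     # map a 2-D feature point to input space

# Initial design in feature space, then a simple BO loop: fit a GP surrogate on
# feature-space points, pick the candidate with the lowest confidence bound,
# evaluate it in the original space, and repeat.
Z = list(rng.uniform(-1.0, 1.0, size=(5, 2)))
Y = [black_box(decode(z)) for z in Z]
for _ in range(20):
    gp = GaussianProcessRegressor(normalize_y=True).fit(np.array(Z), np.array(Y))
    candidates = rng.uniform(-1.0, 1.0, size=(256, 2))
    mu, sd = gp.predict(candidates, return_std=True)
    z_next = candidates[np.argmin(mu - 2.0 * sd)]
    Z.append(z_next)
    Y.append(black_box(decode(z_next)))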

Aligning Time Series on Incomparable Space

Data is often gathered sequentially in the form of a time series, which consists of a sequence of data points observed at successive time points. Dynamic time warping (DTW) defines a meaningful similarity measure between two time series. Oftentimes, the pairs of time series we are interested in are defined on different spaces: for instance, one might want to align a video with a corresponding audio waveform, potentially sampled at different frequencies.
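
For reference, the standard DTW distance between two sequences living in the same space can be computed with the usual dynamic program below (a minimal sketch); when the two series live on incomparable spaces, a ground cost between individual points is no longer available, which is the setting the post addresses.

import numpy as np

def dtw(x, y, dist=lambda a, b: abs(a - b)):
    """Classic dynamic-time-warping distance between two 1-D sequences.

    Fills the usual cumulative-cost table, where each cell extends the
    cheapest of the three allowed warping moves.
    """
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(x[i - 1], y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two sequences sampled at different rates still align sensibly:
print(dtw([0, 1, 2, 3, 2, 1], [0, 0, 1, 1, 2, 2, 3, 3, 2, 2, 1, 1]))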

Estimating Barycenters of Measures in High Dimensions

Barycenters summarize populations of measures, but existing methods for computing them do not scale to high dimensions. We propose a scalable algorithm for estimating barycenters in high dimensions by turning the optimization over measures into a more tractable optimization over a space of generative models.
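
One way to write the reparameterization (notation ours): parameterize the barycenter as the pushforward of a fixed base distribution through a generative model $G_\theta$ and optimize over $\theta$ rather than over the measure itself,

\[
\min_{\theta} \; \sum_{i=1}^{m} \lambda_i \, \mathcal{D}\!\left((G_\theta)_{\#}\rho,\; \mu_i\right),
\qquad \lambda_i \ge 0, \quad \sum_{i=1}^{m} \lambda_i = 1,
\]

where $\rho$ is a base distribution (e.g. a standard Gaussian), $\mu_1, \dots, \mu_m$ are the given measures, and $\mathcal{D}$ is an optimal-transport discrepancy such as a (regularized) Wasserstein distance.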

Efficiently Sampling Functions from Gaussian Process Posteriors

Efficiently sampling from Gaussian process posteriors is important in many practical applications. Using Matheron's rule, we decouple the posterior into a prior term and a data-dependent update, which allows us to sample functions from the Gaussian process posterior in linear time.
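
In symbols (notation ours), Matheron's rule expresses a posterior sample as a prior sample plus a data-dependent update,

\[
(f \mid \mathbf{y})(\cdot) \;\overset{d}{=}\; f(\cdot) \;+\; k(\cdot, X)\big(k(X, X) + \sigma^2 I\big)^{-1}\big(\mathbf{y} - f(X) - \boldsymbol{\varepsilon}\big),
\qquad \boldsymbol{\varepsilon} \sim \mathcal{N}(0, \sigma^2 I),
\]

where $f$ is a draw from the prior, $X$ are the training inputs and $\mathbf{y}$ the noisy observations. Roughly, once the prior draw is approximated with a cheap basis (for example, random features), evaluating the right-hand side at new test points costs time linear in their number, since the update term is a fixed weighted sum of canonical basis functions $k(\cdot, x_i)$.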

Healing Products of Gaussian Process Experts

Products of Gaussian process experts commonly suffer from poor performance when individual experts are weak. We propose aggregation and weighting approaches to heal these expert models.
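
As a reference point, the sketch below implements a standard generalized product-of-experts aggregation of Gaussian predictions, the kind of combination rule such models are built on; the choice of the weights is exactly where different healing strategies differ. Names and the uniform default weights are illustrative.

import numpy as np

def gpoe_aggregate(means, variances, weights=None):
    """Generalized product-of-experts aggregation of Gaussian expert predictions.

    The aggregate precision is a weighted sum of expert precisions, and the
    aggregate mean is the correspondingly weighted combination of expert means.
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    if weights is None:
        weights = np.full(len(means), 1.0 / len(means))   # uniform weights summing to 1
    weights = np.asarray(weights, dtype=float)
    precision = np.sum(weights / variances)
    mean = np.sum(weights * means / variances) / precision
    return mean, 1.0 / precision

# Three experts predicting at the same test input:
print(gpoe_aggregate(means=[0.9, 1.1, 2.0], variances=[0.2, 0.3, 1.5]))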

Matérn Gaussian Processes on Riemannian Manifolds

Gaussian processes are a useful technique for modeling unknown functions. They are used in many application areas, particularly in cases where quantifying uncertainty is important, such as in strategic decision-making systems. We study how to extend this model class to model functions whose domain is a Riemannian manifold, for example, a sphere or cylinder. We do so in a manner which is (a) mathematically well-posed, and (b) constructive enough that the kernel can be computed, so that these processes can be trained with standard methods.
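
On a compact manifold, the resulting kernel can be written as a spectral series (notation ours), which is what makes it computable once the series is truncated:

\[
k_{\nu}(x, x') \;=\; \frac{\sigma^2}{C_\nu} \sum_{n=0}^{\infty} \left(\frac{2\nu}{\kappa^2} + \lambda_n\right)^{-\nu - d/2} f_n(x)\, f_n(x'),
\]

where $(\lambda_n, f_n)$ are eigenvalue-eigenfunction pairs of the Laplace-Beltrami operator on the $d$-dimensional manifold, $\kappa$ and $\nu$ play the roles of the length scale and smoothness parameters, and $C_\nu$ normalizes the variance.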