The Statistical Machine Learning group is a research group at UCL’s Centre for Artificial Intelligence. Our research expertise is in data-efficient machine learning, probabilistic modeling, and autonomous decision making.

If you are interested in joining the Statistical Machine Learning group, please check out our openings.

In order to leave the factory floors and research labs, future robots must abandon their stiff and pre-programmed movements and be …

The use of Gaussian process models is typically limited to datasets with a few tens of thousands of observations due to their …

I will discuss multiple-data-source prediction and modelling problems arising in a number of fields, for instance in omics-based …

We derive a family of Monte Carlo estimators for gradients of expectations, which is related to the log-derivative trick, but involves …
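As background for this abstract (a hedged sketch, not the estimator family the paper derives), the classical log-derivative trick it builds on estimates a gradient of an expectation as ∇_μ E_{x∼p(x;μ)}[f(x)] = E[f(x) ∇_μ log p(x;μ)]. A minimal NumPy illustration for a Gaussian with unit variance, where the score is simply x − μ:

```python
import numpy as np

# Score-function (log-derivative) estimator of d/dmu E_{x ~ N(mu, 1)}[f(x)].
# For f(x) = x^2 we have E[x^2] = mu^2 + 1, so the true gradient is 2 * mu.
def score_function_grad(f, mu, n_samples=200_000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, 1.0, size=n_samples)
    # d/dmu log N(x; mu, 1) = x - mu (the "score")
    return np.mean(f(x) * (x - mu))

mu = 1.5
grad_est = score_function_grad(lambda x: x**2, mu)
# grad_est approaches the true gradient 2 * mu = 3.0 as n_samples grows
```

The estimator is unbiased but can be high-variance, which is one motivation for studying alternative Monte Carlo gradient estimators.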

In machine learning, new learning algorithms are designed by borrowing ideas from optimization and statistics followed by an extensive …

Learning workable representations of dynamical systems is becoming an increasingly important problem in a number of application areas. By leveraging recent work connecting deep neural networks to systems of differential equations, we propose variational integrator networks, a class of neural network architectures designed to ensure faithful representations of the dynamics under study. This class of network architectures facilitates accurate long-term prediction, interpretability, and data-efficient learning, while still remaining highly flexible and capable of modeling complex behavior. We demonstrate that they can accurately learn dynamical systems from both noisy observations in phase space and from image pixels within which the unknown dynamics are embedded.

Mathematics for Machine Learning is a book that motivates people to learn mathematical concepts. The book is not intended to cover advanced machine learning techniques, because there are already plenty of books doing this. Instead, we aim to provide the necessary mathematical skills to read those other books.

The interpretation of Large Hadron Collider (LHC) data in the framework of Beyond the Standard Model (BSM) theories is hampered by the need to run computationally expensive event generators and detector simulators. Performing statistically convergent scans of high-dimensional BSM theories is consequently challenging, and in practice infeasible for very high-dimensional BSM theories. We present here a new machine learning method that accelerates the interpretation of LHC data by learning the relationship between BSM theory parameters and data. As a proof of concept, we demonstrate that this technique accurately predicts natural SUSY signal events in two signal regions at the High Luminosity LHC, up to four orders of magnitude faster than standard techniques. The new approach makes it possible to rapidly and accurately reconstruct the theory parameters of complex BSM theories, should an excess in the data be discovered at the LHC.

Bayesian optimization is a sample-efficient approach to global optimization that relies on theoretically motivated value heuristics (acquisition functions) to guide its search process. Fully maximizing acquisition functions produces the Bayes decision rule, but this ideal is difficult to achieve since these functions are frequently non-trivial to optimize. This is especially true when evaluating queries in parallel, where acquisition functions are routinely non-convex, high-dimensional, and intractable. We first show that acquisition functions estimated via Monte Carlo integration are consistently amenable to gradient-based optimization. Subsequently, we identify a common family of acquisition functions, including EI and UCB, whose characteristics not only facilitate but justify the use of greedy approaches for their maximization.
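To illustrate the kind of Monte Carlo acquisition estimate the abstract refers to (a minimal sketch under a toy Gaussian posterior, not the paper's implementation), expected improvement for minimization can be written via the reparameterization x = μ + σz with fixed base samples z ∼ N(0, 1); holding z fixed makes the estimate a smooth function of (μ, σ), hence amenable to gradient-based optimization:

```python
import math
import numpy as np

# Monte Carlo expected improvement (minimization) under a Gaussian
# posterior N(mu, sigma^2), reparameterized with fixed base samples z.
def mc_ei(mu, sigma, best, z):
    return np.mean(np.maximum(best - (mu + sigma * z), 0.0))

# Closed-form EI for comparison: sigma * (g * Phi(g) + phi(g)),
# with g = (best - mu) / sigma.
def closed_form_ei(mu, sigma, best):
    g = (best - mu) / sigma
    Phi = 0.5 * (1.0 + math.erf(g / math.sqrt(2.0)))
    phi = math.exp(-0.5 * g * g) / math.sqrt(2.0 * math.pi)
    return sigma * (g * Phi + phi)

rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)
mu, sigma, best = 0.2, 0.5, 0.0
# mc_ei(mu, sigma, best, z) converges to closed_form_ei(mu, sigma, best)
# as the number of base samples grows.
```

In practice the posterior (μ, σ) at a query point comes from a Gaussian process, and the same fixed-sample construction lets gradients flow through the acquisition value back to the query location.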



Gibbs sampling is a Markov Chain Monte Carlo (MCMC) method often used in Bayesian learning. It is widely believed that MCMC methods are …


We are looking for a (Senior) Research Fellow at the intersection of climate science and machine learning.

We are looking for a (Senior) Research Fellow at the intersection of robotics and machine learning.

Rasmus Larsen visits SML

Samuel Cohen joins SML