Benjamin Chamberlain: A Continuous Perspective on Graph Neural Networks
Abstract
In this talk I will discuss several recent papers that develop new graph neural networks by considering their relation to continuous processes. I will show how graph neural networks can be derived as numerical schemes for solving differential equations, what they have to do with Perelman’s famous solution to the Poincaré conjecture, and how they are related to string theory.

Graphs are fundamentally discrete structures, and at first glance treating them continuously does not appear to be a promising research direction. However, there are many examples where handling discrete objects as if they were continuous has been a catalyst for progress. Photons are now known to be discrete, yet modelling such physical processes with continuous differential equations like heat diffusion produced many of the great breakthroughs of classical physics and chemistry. In computer science, digital images are likewise discrete, but continuous tools such as diffusion-based denoising are still widely used, and whether digital images are best modelled continuously or discretely remains a source of great philosophical debate. Even in ML, the most common approach to handling discrete objects is to embed them into a continuous space. I will show that for graph ML too, there is much to be gained from unlocking the magnificent toolbox of continuous mathematics.
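To give a taste of the first claim, here is a minimal sketch (mine, not code from the talk; the step size tau, the number of steps, and the toy path graph are illustrative assumptions). Explicit Euler integration of the graph heat-diffusion equation dX/dt = (A_hat - I) X produces update steps of exactly the residual message-passing form that a (weightless) GNN layer performs:

import numpy as np

def heat_diffusion(A, X, tau=0.1, steps=50):
    """Integrate dX/dt = (A_hat - I) X with explicit Euler steps.

    Each step X <- X + tau * (A_hat - I) X mixes every node's features
    with its neighbours' -- the same residual message-passing update
    that a GNN layer without learned weights performs.
    """
    d = A.sum(axis=1)                    # node degrees (assumes no isolated nodes)
    A_hat = A / np.sqrt(np.outer(d, d))  # symmetric normalization D^-1/2 A D^-1/2
    L = A_hat - np.eye(len(A))           # generator of the diffusion
    for _ in range(steps):
        X = X + tau * L @ X              # one Euler step == one "layer"
    return X

# Toy example: heat placed on one node of a 4-node path graph spreads out.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
X = np.array([[1.0], [0.0], [0.0], [0.0]])
print(heat_diffusion(A, X))

Replacing the fixed matrix A_hat with learned, feature-dependent coefficients, and the Euler scheme with other numerical solvers, is one route from this continuous picture to trainable graph neural network architectures.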