Christian Walder: New Tricks for Estimating Gradients of Expectations


We derive a family of Monte Carlo estimators for gradients of expectations which is related to the log-derivative trick, but involves pairwise interactions between samples. The first of these arises from either a) introducing and approximating an integral representation based on the fundamental theorem of calculus, or b) applying the reparameterisation trick to an implicit parameterisation under an infinitesimal perturbation of the parameters. From the former perspective we generalise to a reproducing kernel Hilbert space representation, giving rise to a locality parameter in the pairwise interactions mentioned above. The resulting estimators are unbiased and are shown to offer an independent component of useful information in comparison with the log-derivative estimator. Promising analytical and numerical examples confirm the intuitions behind the new estimators.
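For context, the log-derivative (score-function) trick to which the abstract compares the new estimators can be sketched as follows. This is a minimal illustrative example, not the estimator presented in the talk; the choice of a Gaussian with mean theta and the test function f(x) = x^2 are assumptions made for the sketch.

```python
# Sketch of the classic log-derivative (score-function) estimator:
#   d/dtheta E_{x ~ p_theta}[f(x)] = E_{x ~ p_theta}[f(x) * d/dtheta log p_theta(x)].
# Here p_theta = N(theta, sigma^2), so d/dtheta log p_theta(x) = (x - theta) / sigma^2,
# and with f(x) = x^2 the true gradient is 2*theta (since E[x^2] = theta^2 + sigma^2).
import numpy as np

def score_function_gradient(theta, sigma=1.0, n_samples=100_000, seed=0):
    """Monte Carlo estimate of d/dtheta E_{x ~ N(theta, sigma^2)}[x^2]."""
    rng = np.random.default_rng(seed)
    x = rng.normal(theta, sigma, size=n_samples)
    f = x ** 2                          # the function whose expectation we differentiate
    score = (x - theta) / sigma ** 2    # d/dtheta log N(x; theta, sigma^2)
    return float(np.mean(f * score))
```

With enough samples the estimate concentrates around the true gradient 2*theta; the estimator is unbiased but can have high variance, which motivates complementary estimators such as those in the talk.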

September 30, 2019 11:00 — 12:00
SML Seminar
LT 130 (Huxley), Imperial College London


Christian Walder obtained a Bachelor of Engineering from the University of Queensland and a PhD in machine learning from the Max Planck Institute in Germany, and has seven years' industrial experience applying advanced analytics in the finance and telecommunications industries. He is presently employed as a senior researcher at CSIRO's Data61, Australia's governmental research organisation, and is an adjunct professor at the Australian National University.
Marc Deisenroth
DeepMind Chair in Artificial Intelligence