Hi! I’m Vincent Dutordoir, a PhD candidate in Machine Learning at the University of Cambridge, working with Professor Zoubin Ghahramani and Carl Henrik Ek.

Prior to my PhD, I was a Senior Research Scientist at Secondmind, a role I have continued part-time during my studies.

My research focuses on the intersection of statistical methods and deep learning, with applications in generative modelling and sequential decision making. Recently, I have been excited about defining stochastic processes using score-based diffusion models [1], [2].

You can find some of my work on GitHub and a list of all my publications on Google Scholar. I’m very happy to connect on Twitter or via e-mail at dutordoirv@gmail.com.

News


Research

Highlighted research papers. A complete list can be found on my Google Scholar profile.

Geometric Neural Diffusion Processes

Emile Mathieu, Vincent Dutordoir, Michael J. Hutchinson, Valentin De Bortoli, Yee Whye Teh, Richard E. Turner

Advances in Neural Information Processing Systems (NeurIPS 2023)

Score-based diffusion models that can be used to learn a generative model of functions.

Neural Diffusion Processes

Vincent Dutordoir, Alan Saul, Zoubin Ghahramani, Fergus Simpson

International Conference on Machine Learning (ICML 2023)

Generative models that define a distribution over functions via their finite-dimensional marginals using diffusion models.

Deep neural networks as point estimates for deep Gaussian processes

Vincent Dutordoir, James Hensman, Mark van der Wilk, Carl Henrik Ek, Zoubin Ghahramani, Nicolas Durrande

Advances in Neural Information Processing Systems (NeurIPS 2021)

Shows the equivalence between the forward propagation of the mean in a deep Gaussian process (GP) and a fully-connected neural network layer. The method can be used to better initialise deep GPs and to understand the uncertainty in DNNs.

Sparse Gaussian Processes with Spherical Harmonic Features

Vincent Dutordoir, Nicolas Durrande, James Hensman

International Conference on Machine Learning (ICML 2020)

We derive the reproducing kernel Hilbert space (RKHS) of spherical kernels, allowing us to design linear-time Gaussian processes (GPs).
