I am a fourth-year Ph.D. student in the Amsterdam Machine Learning Lab (AMLab) and the AI4Science lab of the University of Amsterdam. I have also interned at Google DeepMind, Microsoft Research (AI4Science), and the Simons Foundation's Flatiron Institute. My research interests include (3D) geometric deep learning, AI4Science, computer vision, generative models, and time series, often working at the intersection of these.
David Ruhe
Ph.D. Student at the University of Amsterdam
Selected Publications
Clifford Group Equivariant Neural Networks (NeurIPS 2023 Oral)
$\mathrm{E}(n)$ steerable equivariance using Clifford's geometric algebra.
David Ruhe, Johannes Brandstetter, Patrick Forré
Geometric Clifford Algebra Networks (Microsoft Research, ICML 2023)
Incorporating geometry into neural network transformations.
David Ruhe, Jayesh K. Gupta, Steven de Keninck, Max Welling, Johannes Brandstetter
Rolling Diffusion Models (Google DeepMind, ICML 2024)
A temporal prior for video diffusion.
David Ruhe, Jonathan Heek, Tim Salimans, Emiel Hoogeboom
Clifford-Steerable Convolutional Neural Networks (ICML 2024)
Steerable $\mathrm{E}(p, q)$ convolutions on multivector fields.
Maxim Zhdanov, David Ruhe$^*$, Maurice Weiler$^*$, Ana Lucic, Johannes Brandstetter, Patrick Forré
Clifford Group Equivariant Simplicial Message Passing Networks (ICLR 2024)
Equivariant $\mathrm{E}(n)$ message passing on simplicial complexes.
Cong Liu$^*$, David Ruhe$^*$, Floor Eijkelboom, Patrick Forré
Self-Supervised Inference in State-Space Models (ICLR 2022)
Learning Kalman filters with physics-informed models.
David Ruhe, Patrick Forré
Normalizing Flows for Hierarchical Bayesian Analysis (ML4PHYS 2022)
Inferring gravitational wave parameters using normalizing flows.
David Ruhe, Kaze Wong, Miles Cranmer, Patrick Forré
Posts
C2C 4: Clifford Group Equivariant Neural Networks
The final post of the series discusses multivector-valued neural networks that are equivariant with respect to actions of the Clifford group. These actions are orthogonal transformations, making the network equivariant or invariant with respect to, e.g., rotations or reflections. Since orthogonal equivariance can be achieved for any quadratic space, we carry out a Lorentz-equivariant high-energy physics experiment.
C2C 3: Geometric Clifford Algebra Networks
In this third post of the series, we further explore the geometric bias that Clifford algebras induce. After studying modern plane-based geometric algebra, we generalize the successful rotational layer to any (Euclidean) group action, thereby discouraging transformations that are geometrically ungrounded. The resulting networks can be regarded as geometric templates, and we evaluate them on a large-scale shallow-water equations experiment.
C2C 2: Clifford Neural Layers for PDE Modeling
This is the second post of the Complex to Clifford (C2C) series, in which we dive into complex and quaternion-valued networks, and build all the way up to Clifford group equivariant networks. Here, we discuss a recent paper that uses the Clifford algebra to construct neural network layers to accelerate PDE solving.
C2C 1: Complex and Quaternion Neural Networks
This is the first post discussing a recent series of papers that build up to Clifford group equivariant models. In this series, we start with complex and quaternion neural networks, which can be generalized from a Clifford algebra perspective. We then turn to the Clifford algebra's geometric applications (which is why Clifford algebras are also known as geometric algebras), and finally show how to incorporate a Clifford group equivariance constraint into such neural networks.
Variational Diffusion Models
An exploration of denoising diffusion models through the paper "Variational Diffusion Models" by Kingma et al. (2021). The post interprets training these models as maximizing the Evidence Lower Bound (ELBO) or, equivalently, as minimizing the Kullback-Leibler divergence between the data and model distributions. A code implementation is also included.