I am a fourth-year Ph.D. student in the Amsterdam Machine Learning Lab (AMLab) and the AI4Science lab of the University of Amsterdam. I work on developing new machine learning algorithms that help answer scientific research questions. My current interests are scientific machine learning, physics-inspired machine learning, inverse problems, generative models, and self-supervised inference.

# David Ruhe

Ph.D. Student at the University of Amsterdam

# Posts

### C2C 4: Clifford Group Equivariant Neural Networks

The final post of the series discusses multivector-valued neural networks that are equivariant with respect to actions of the Clifford group. These actions operate as orthogonal transformations, effectively making the network equivariant or invariant with respect to, e.g., rotations and reflections. Since orthogonal equivariance can be achieved for any quadratic space, we carry out a Lorentz-equivariant high-energy physics experiment.
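The equivariance property described above can be illustrated with a toy numpy sketch (this is not the paper's architecture): a layer that only mixes vector features across channels commutes with any orthogonal transformation of the underlying space.

```python
import numpy as np

def layer(x, w):
    """Toy vector-valued layer: mixes channels with scalar weights.

    x: (channels, n) array of n-dimensional vector features.
    w: (channels, channels) weight matrix. Because it acts only on the
    channel axis, it commutes with any orthogonal map applied to R^n.
    """
    return w @ x

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))                   # 4 vector features in R^3
w = rng.normal(size=(4, 4))
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix

# Equivariance: transforming the input and then applying the layer
# equals applying the layer and then transforming the output.
lhs = layer(x @ R.T, w)
rhs = layer(x, w) @ R.T
print(np.allclose(lhs, rhs))  # True
```

The networks in the post generalize this idea from plain vectors to full multivectors, where the Clifford group supplies the orthogonal actions.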

### C2C 3: Geometric Clifford Algebra Networks

In this third post of the series, we further explore the geometric bias that Clifford algebras induce. After studying modern plane-based geometric algebra, we generalize the successful rotational layer to arbitrary (Euclidean) group actions, thereby discouraging transformations that are geometrically ungrounded. The resulting networks can be regarded as geometric templates. We evaluate them on a large-scale shallow-water equations experiment.

### C2C 2: Clifford Neural Layers for PDE Modeling

This is the second post of the Complex to Clifford (C2C) series, in which we dive into complex- and quaternion-valued networks and build all the way up to Clifford group equivariant networks. Here, we discuss a recent paper that uses the Clifford algebra to construct neural network layers that accelerate PDE solving.

### C2C 1: Complex and Quaternion Neural Networks

This is the first post in a series discussing recent papers that build up to Clifford group equivariant models. We start with complex and quaternion neural networks, which can be generalized from a Clifford algebra perspective. We then explore the Clifford algebra's geometric applications (which is why these algebras are also known as geometric algebras), and finally show how to incorporate a Clifford group equivariance constraint into such networks.
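As a minimal illustration (not code from the post), the building block of quaternion-valued layers is the Hamilton product, which couples the four real components of each feature:

```python
import numpy as np

def hamilton(p, q):
    """Hamilton product of two quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

# The defining relation of the quaternion units: i * j = k.
i = np.array([0., 1., 0., 0.])
j = np.array([0., 0., 1., 0.])
k = np.array([0., 0., 0., 1.])
print(hamilton(i, j))  # [0. 0. 0. 1.]  (= k)
```

A quaternion linear layer applies this product between quaternion weights and quaternion activations, sharing parameters across the four components, just as a complex layer shares them across real and imaginary parts.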

### Variational Diffusion Models

An exploration of denoising diffusion models based on the paper "Variational Diffusion Models" by Kingma et al. (2021). Training these models can be interpreted as maximizing the Evidence Lower Bound (ELBO) or, equivalently, minimizing the Kullback-Leibler divergence between the data and model distributions. A code implementation is included.
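The ELBO/KL equivalence mentioned above can be summarized in one line (notation my own, not taken from the post). Since $\log p_\theta(x) \ge \mathrm{ELBO}_\theta(x)$,

$$
D_{\mathrm{KL}}\big(p_{\text{data}} \,\|\, p_\theta\big)
= -H(p_{\text{data}}) - \mathbb{E}_{p_{\text{data}}}\big[\log p_\theta(x)\big]
\le -H(p_{\text{data}}) - \mathbb{E}_{p_{\text{data}}}\big[\mathrm{ELBO}_\theta(x)\big],
$$

so maximizing the expected ELBO minimizes an upper bound on the KL divergence between data and model; the data entropy $H(p_{\text{data}})$ is constant in $\theta$.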

# Selected Publications

## Clifford Group Equivariant Neural Networks (NeurIPS 2023 Oral)

### $\mathrm{E}(n)$ steerable equivariance using Clifford's geometric algebra.

David Ruhe, Johannes Brandstetter, Patrick Forré

## Geometric Clifford Algebra Networks (ICML 2023)

### Incorporating geometry into neural network transformations.

David Ruhe, Jayesh K. Gupta, Steven de Keninck, Max Welling, Johannes Brandstetter

## Self-Supervised Inference in State-Space Models (ICLR 2022)

### Learning Kalman filters with physics-informed models.

David Ruhe, Patrick Forré

## Normalizing Flows for Hierarchical Bayesian Analysis (ML4PHYS 2022)

### Inferring gravitational wave parameters using normalizing flows.

David Ruhe, Kaze Wong, Miles Cranmer, Patrick Forré

## Detecting Dispersed Radio Transients Using Convolutional Neural Networks (ASCOM)

### Using neural networks to find high-energy events in the deep radio universe.

David Ruhe, Mark Kuiack, Antonia Rowlinson, Ralph Wijers, Patrick Forré
