Two papers from the MLS group, covering machine learning for atomistic systems and for solving differential equations, have been accepted at the top conference ICML 2024.

May 7, 2024

Vectorized Conditional Neural Fields: A Framework for Solving Time-dependent Parametric Partial Differential Equations

Authors: Jan Hagnberger, Marimuthu Kalimuthu, Daniel Musekamp, Mathias Niepert
Proceedings: In Proceedings of the 41st International Conference on Machine Learning (ICML 2024)
Abstract:
Transformer models are increasingly used for solving Partial Differential Equations (PDEs). Several adaptations have been proposed, all of which suffer from the typical problems of Transformers, such as quadratic memory and time complexity. Furthermore, all prevalent architectures for PDE solving lack at least one of several desirable properties of an ideal surrogate model, such as (i) generalization to PDE parameters not seen during training, (ii) spatial and temporal zero-shot super-resolution, (iii) continuous temporal extrapolation, (iv) applicability to PDEs of different dimensionalities, and (v) efficient inference for longer temporal rollouts. To address these limitations, we propose Vectorized Conditional Neural Fields (VCNEFs) representing the solution of time-dependent PDEs as neural fields. Contrary to prior methods, VCNEFs compute their solutions in parallel for multiple spatiotemporal query points and model their dependencies through attention mechanisms. Moreover, VCNEFs can condition the neural field on both the initial conditions and the parameters of the PDEs. An extensive set of experiments demonstrates that VCNEFs are competitive with and often outperform existing ML-based surrogate models.
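To illustrate the idea of evaluating a conditional neural field in parallel over many spatiotemporal query points, here is a minimal toy sketch. This is not the VCNEF architecture from the paper (which uses attention to model dependencies between query points); it only shows the conditioning-and-vectorization pattern: a small random-weight MLP maps a batch of (x, t) queries, concatenated with a PDE parameter vector, to solution values in a single vectorized pass. All names and dimensions here are illustrative assumptions.

```python
import numpy as np

# Toy sketch (not the paper's architecture): a neural field conditioned on
# a PDE parameter vector, evaluated for a whole batch of spatiotemporal
# query points (x, t) at once rather than point by point.

rng = np.random.default_rng(0)
hidden_dim = 32  # arbitrary choice for this illustration

def conditional_field(queries, pde_params, weights):
    """queries: (N, 2) array of (x, t); pde_params: (P,) conditioning vector."""
    W1, b1, W2, b2 = weights
    # Broadcast the conditioning vector to every query point, then concatenate.
    cond = np.broadcast_to(pde_params, (queries.shape[0], pde_params.shape[0]))
    inp = np.concatenate([queries, cond], axis=1)  # (N, 2 + P)
    h = np.tanh(inp @ W1 + b1)                     # (N, hidden_dim)
    return h @ W2 + b2                             # (N, 1) solution values

# Example: 100 query points of a 1D PDE at time t = 0.5, with one scalar
# PDE parameter (e.g. a viscosity coefficient).
queries = np.stack([np.linspace(0.0, 1.0, 100), np.full(100, 0.5)], axis=1)
params = np.array([0.01])
weights = (rng.normal(size=(3, hidden_dim)), np.zeros(hidden_dim),
           rng.normal(size=(hidden_dim, 1)), np.zeros(1))

u = conditional_field(queries, params, weights)
print(u.shape)  # all query points evaluated in one vectorized pass
```

Because the field is a function of continuous coordinates, nothing ties it to a fixed grid: querying a finer set of (x, t) points is what enables the zero-shot super-resolution property described in the abstract.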

Structure-Aware E(3)-Invariant Molecular Conformer Aggregation Networks

Authors: Duy Nguyen, Nina Lukashina, Tai Nguyen, Trung Nguyen, Nhat Ho, Jan Peters, Daniel Sonntag, Viktor Zaverkin, Mathias Niepert
Proceedings: In Proceedings of the 41st International Conference on Machine Learning (ICML 2024)
Abstract:
A molecule's 2D representation consists of its atoms, their attributes, and the molecule's covalent bonds. A 3D (geometric) representation of a molecule is called a conformer and consists of its atom types and Cartesian coordinates. Every conformer has a potential energy; the lower it is, the more likely it occurs in nature. Most existing machine learning methods for molecular property prediction consider either 2D molecular graphs or 3D conformer structure representations in isolation. Inspired by recent work on using ensembles of conformers in conjunction with 2D graph representations, we propose E(3)-invariant molecular conformer aggregation networks. The method integrates a molecule's 2D representation with that of multiple of its conformers. Contrary to prior work, we propose a novel 2D-3D aggregation mechanism based on a differentiable solver for the Fused Gromov-Wasserstein Barycenter problem and an efficient online conformer generation method based on distance geometry. We show that the proposed aggregation mechanism is E(3) invariant and we provide an efficient GPU implementation. Moreover, we demonstrate that the aggregation mechanism significantly helps outperform state-of-the-art property prediction methods on established datasets.
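The E(3) invariance claimed for the aggregation mechanism means its output is unchanged when a conformer's Cartesian coordinates are rotated, reflected, or translated. The following small sketch (not the paper's network; just an illustration of the property) shows why descriptors built from interatomic distances have this invariance: applying an arbitrary orthogonal transformation plus a translation to a toy conformer leaves its sorted pairwise-distance vector identical.

```python
import numpy as np

# Illustrative check (not the paper's aggregation network): a conformer
# descriptor built from interatomic distances is E(3)-invariant, i.e.
# unchanged under any rotation/reflection and translation of the coordinates.

def distance_descriptor(coords):
    """coords: (n_atoms, 3) Cartesian coordinates -> sorted pairwise distances."""
    diff = coords[:, None, :] - coords[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    iu = np.triu_indices(coords.shape[0], k=1)  # upper triangle: each pair once
    return np.sort(dists[iu])

rng = np.random.default_rng(1)
conformer = rng.normal(size=(5, 3))  # toy 5-atom conformer

# Random orthogonal transform (QR gives an orthogonal Q, possibly a
# reflection, which E(3) also contains) plus a translation.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
transformed = conformer @ Q.T + np.array([1.0, -2.0, 0.5])

print(np.allclose(distance_descriptor(conformer),
                  distance_descriptor(transformed)))  # True
```

Distance geometry, which the abstract mentions as the basis of the efficient online conformer generation, works with exactly such interatomic-distance information rather than absolute coordinates.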
