Neural network exchange-correlation functionals, differentiable SCF solvers, and equivariant architectures to accelerate quantum chemistry from first principles.
I am a PhD researcher at the Technical University of Munich working at the intersection of machine learning and quantum chemistry. My goal is to make density functional theory faster and more accurate by learning its hardest component, the exchange-correlation functional, directly from data.
I develop equivariant graph neural networks that operate on molecular electronic structure, differentiable self-consistent field solvers in JAX, and training schemes that exploit physical derivative information on the Grassmannian manifold of density matrices.
Learning the exchange-correlation functional of DFT using expressive neural network architectures, from semi-local meta-GGA models to fully equivariant, graph-based functionals that incorporate the molecular topology.
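To make the semi-local end of this concrete, here is a minimal sketch of what a learned meta-GGA-level functional can look like: a small MLP maps local grid features (density, gradient norm, kinetic-energy density) to an exchange-correlation energy density, which is integrated with the grid weights. All feature choices, the log-transform, and the network sizes here are illustrative assumptions, not the architecture described above.

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes=(3, 32, 32, 1)):
    # Random weights for a small MLP; sizes[0] = 3 local features.
    params = []
    for m, n in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (m, n)) / jnp.sqrt(m)
        params.append((w, jnp.zeros(n)))
    return params

def exc_density(params, rho, grad_rho_norm, tau):
    # Semi-local features per grid point: density, gradient norm,
    # kinetic-energy density (hypothetical feature choice; the log
    # transform tames the wide dynamic range of rho).
    x = jnp.stack([jnp.log(rho + 1e-10),
                   jnp.log(grad_rho_norm + 1e-10),
                   jnp.log(tau + 1e-10)], axis=-1)
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return (x @ w + b).squeeze(-1)  # epsilon_xc at each grid point

def exc_total(params, rho, grad_rho_norm, tau, weights):
    # E_xc = sum_g w_g * rho_g * epsilon_xc(features_g)
    return jnp.sum(weights * rho * exc_density(params, rho, grad_rho_norm, tau))

# The density-derivative part of the XC potential on the grid comes
# from automatic differentiation of the total energy w.r.t. rho.
vxc_fn = jax.grad(exc_total, argnums=1)
```

For a full meta-GGA potential one would also need derivatives with respect to the gradient and kinetic-energy-density channels; the point here is only that `jax.grad` through the quadrature yields functional derivatives for free.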
End-to-end differentiable self-consistent field calculations in JAX, enabling gradient-based training through the iterative Kohn-Sham procedure with custom backward passes through eigendecompositions and DIIS.
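A toy illustration of the idea, not the solver itself: the sketch below unrolls a fixed number of closed-shell Roothaan iterations in JAX, so reverse-mode autodiff propagates gradients through every Fock build and eigendecomposition. The integral tensor shapes, the fixed iteration count, and the absence of DIIS are simplifying assumptions; a production solver would use implicit differentiation or custom backward passes as described above.

```python
import jax
import jax.numpy as jnp

def density_matrix(fock, n_occ):
    # Diagonalize the Fock matrix and build the closed-shell density
    # matrix from the n_occ lowest orbitals (aufbau filling).
    _, c = jnp.linalg.eigh(fock)
    c_occ = c[:, :n_occ]
    return 2.0 * c_occ @ c_occ.T

def scf(h_core, eri, n_occ, n_iter=30):
    # Toy SCF with a fixed number of iterations so the whole loop is
    # differentiable by unrolling; eri is a symmetric two-electron
    # integral tensor in chemists' notation (an assumption here).
    def step(dm, _):
        j = jnp.einsum('pqrs,rs->pq', eri, dm)   # Coulomb
        k = jnp.einsum('prqs,rs->pq', eri, dm)   # exchange
        fock = h_core + j - 0.5 * k
        return density_matrix(fock, n_occ), None
    dm0 = density_matrix(h_core, n_occ)
    dm, _ = jax.lax.scan(step, dm0, None, length=n_iter)
    return dm

def energy(h_core, eri, n_occ):
    # Electronic energy from the (approximately) converged density.
    dm = scf(h_core, eri, n_occ)
    j = jnp.einsum('pqrs,rs->pq', eri, dm)
    k = jnp.einsum('prqs,rs->pq', eri, dm)
    return jnp.einsum('pq,pq->', dm, h_core + 0.5 * (j - 0.5 * k))
```

Calling `jax.grad(energy)` then differentiates through all `n_iter` Fock builds and `eigh` calls; the caveat is that the built-in `eigh` gradient is ill-conditioned near degenerate eigenvalues, which is one motivation for the custom backward passes mentioned above.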
SE(3)-equivariant graph neural networks that respect the symmetries of molecular systems, building on frameworks like e3nn, NequIP, and EquiformerV2 for electronic structure prediction.
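The core property these frameworks guarantee can be shown without e3nn at all. The hand-rolled layer below (an illustrative sketch, not the e3nn/NequIP/EquiformerV2 API) produces per-atom output vectors as invariant-weighted sums of interatomic displacement vectors, so rotating the input positions rotates the outputs identically, and translating them leaves the outputs unchanged.

```python
import jax
import jax.numpy as jnp

def equivariant_layer(positions, scalars):
    # positions: (N, 3) atomic coordinates; scalars: (N,) rotation-
    # invariant per-atom features (hypothetical inputs).
    n = positions.shape[0]
    diff = positions[:, None, :] - positions[None, :, :]   # (N, N, 3)
    dist2 = jnp.sum(diff ** 2, axis=-1)                    # invariant
    # Pair weights built only from invariants (distances, scalars).
    w = jnp.exp(-dist2) * scalars[None, :]
    w = w * (1.0 - jnp.eye(n))                             # drop self-pairs
    # Output vectors: invariant-weighted sums of displacement vectors,
    # which transform exactly like the positions under rotation.
    return jnp.einsum('ij,ijk->ik', w, diff)
```

Because the weights depend only on invariants and the outputs are linear in the displacement vectors, `equivariant_layer(positions @ R.T, scalars)` equals `equivariant_layer(positions, scalars) @ R.T` for any orthogonal `R`; equivariant architectures generalize this pattern to higher-order tensor features.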
Learning intelligent initial guesses and convergence strategies for self-consistent field calculations to reduce the computational cost of density functional theory at scale.
Count words, characters, and tokens while compacting markdown-heavy text for tighter prompts and cleaner sharing.
Inspect molecular structures in an interactive viewer with alternate representations for geometry, graphs, and related views.
Convert between common quantum chemistry units quickly without leaving the browser or rebuilding your workflow.
Design ML architecture flowcharts interactively with color-coded blocks, snap alignment, and clean PDF figure export for papers.