In this project led by @leeley18, we show that end-to-end training with differentiable physics produces highly effective hybrid physics/ML models for density functional theory.
We find that the prior knowledge embedded in the physics computation itself acts as an implicit regularizer that greatly improves the generalization of machine learning models for physics.
Please check out our recent paper:
This paper shows that we can use neural nets to improve* Kohn-Sham DFT, one of the most popular methods in computational chemistry!
* We still have lots of work to do to make this practical, e.g., the paper only models 1D systems.
It's also a real tour de force for automatic differentiation -- solving the Kohn-Sham equations requires solving an eigenvalue problem nested inside a fixed-point iteration.
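To see why this stresses automatic differentiation, here is a minimal toy sketch (not the paper's code; the grid size, potential, and function names are all hypothetical) of a Kohn-Sham-style self-consistent field loop in JAX: each outer iteration diagonalizes the current Hamiltonian, rebuilds the density from the lowest orbital, and repeats to a fixed point, and the whole nested computation is differentiable with `jax.grad`.

```python
import jax
import jax.numpy as jnp

N = 16  # number of 1D grid points (toy value)

def hamiltonian(density, v_strength):
    # Kinetic part: discrete Laplacian. The density-dependent diagonal
    # term is a toy stand-in for the (possibly learned) effective potential.
    lap = jnp.eye(N, k=1) + jnp.eye(N, k=-1) - 2.0 * jnp.eye(N)
    return -0.5 * lap + v_strength * jnp.diag(density)

def scf_step(density, v_strength):
    # Inner eigenvalue problem: diagonalize H(density) and take the
    # lowest orbital; its square is the new density (eigh normalizes it).
    _, vecs = jnp.linalg.eigh(hamiltonian(density, v_strength))
    psi = vecs[:, 0]
    return psi ** 2

def converged_density(v_strength, n_iter=50):
    # Outer fixed-point iteration: density -> scf_step(density, ...).
    density = jnp.ones(N) / N
    for _ in range(n_iter):
        density = scf_step(density, v_strength)
    return density

# Differentiate an observable of the converged density through the entire
# nested computation: eigensolve inside a fixed-point loop.
observable = lambda v: jnp.sum(converged_density(v) ** 2)
grad = jax.grad(observable)(0.5)
```

JAX can do this because `jnp.linalg.eigh` has a well-defined derivative rule (for non-degenerate spectra), so gradients flow through every diagonalization in the unrolled loop.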