Publication | Open Access
Kohn-Sham Equations as Regularizer: Building Prior Knowledge into Machine-Learned Physics
174 Citations | 56 References | 2021
Prior knowledge is essential for physics machine learning, yet it is rarely incorporated directly into the physics computation itself. Training neural networks with Kohn‑Sham equations as an implicit regularizer yields models that learn the full one‑dimensional H₂ dissociation curve to chemical accuracy, generalize to unseen molecules, and eliminate self‑interaction error.
Including prior knowledge is important for effective machine learning models in physics, and is usually achieved by explicitly adding loss terms or constraints on model architectures. Prior knowledge embedded in the physics computation itself rarely draws attention. We show that solving the Kohn-Sham equations when training neural networks for the exchange-correlation functional provides an implicit regularization that greatly improves generalization. Two separations suffice for learning the entire one-dimensional H₂ dissociation curve within chemical accuracy, including the strongly correlated region. Our models also generalize to unseen types of molecules and overcome self-interaction error.
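To make the idea concrete, here is a minimal toy sketch (not the authors' code, and much simpler than their 1D DFT setup): a one-dimensional Kohn-Sham self-consistency loop on a soft-Coulomb grid in which the exchange-correlation (XC) potential comes from a small, randomly initialized neural network. Training the XC network *through* such a loop, rather than fitting energies directly, is what provides the implicit regularization; this sketch only runs the forward solve. Grid size, potentials, and the network architecture are all illustrative assumptions.

```python
# Toy 1D Kohn-Sham self-consistency loop with a neural XC term.
# Illustrative only: all parameters and potentials are assumptions,
# not the setup used in the paper.
import numpy as np

rng = np.random.default_rng(0)
N, L = 200, 10.0                       # grid points, box length (assumed)
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]

# External potential: two soft-Coulomb wells, a crude stand-in for 1D H2.
v_ext = (-1.0 / np.sqrt((x - 4.0) ** 2 + 1.0)
         - 1.0 / np.sqrt((x - 6.0) ** 2 + 1.0))

# Tiny untrained neural net mapping local density -> XC energy density.
W1, b1 = 0.1 * rng.standard_normal((1, 8)), np.zeros(8)
W2, b2 = 0.1 * rng.standard_normal((8, 1)), np.zeros(1)

def eps_xc(n):
    h = np.tanh(n[:, None] @ W1 + b1)
    return (h @ W2 + b2)[:, 0]         # per-point XC energy density

def v_xc(n, eps=1e-6):
    # Local (LDA-like) XC potential: d(eps_xc(n) * n) / dn,
    # approximated by a central finite difference.
    return ((eps_xc(n + eps) * (n + eps) - eps_xc(n - eps) * (n - eps))
            / (2 * eps))

def hartree(n):
    # Soft-Coulomb Hartree potential evaluated on the grid.
    return (n[None, :] / np.sqrt((x[:, None] - x[None, :]) ** 2 + 1.0)).sum(1) * dx

# Kinetic operator -0.5 d^2/dx^2 via second-order finite differences.
T = (np.diag(np.full(N, 1.0)) - 0.5 * np.diag(np.ones(N - 1), 1)
     - 0.5 * np.diag(np.ones(N - 1), -1)) / dx**2

n = np.full(N, 2.0 / L)                # initial guess: 2 electrons, uniform
for _ in range(30):                    # damped self-consistent iterations
    H = T + np.diag(v_ext + hartree(n) + v_xc(n))
    w, phi = np.linalg.eigh(H)
    phi0 = phi[:, 0] / np.sqrt(dx)     # lowest orbital, doubly occupied
    n = 0.7 * n + 0.3 * (2.0 * phi0 ** 2)

print(round(n.sum() * dx, 3))          # density integrates to 2 electrons
```

In the paper's training scheme, gradients of the converged energy with respect to the XC network's weights are taken through this entire loop (made differentiable and with convergence tricks beyond this sketch), so the network only ever sees densities that are self-consistent solutions of the Kohn-Sham equations.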