Training-free hyperparameter optimization of neural networks for electronic structures in matter

Publication: Contribution to journal › Research article › Contributed › Peer-reviewed

Contributors

  • Lenz Fiedler, Chair of Radiation Physics (gB/HZDR), Helmholtz-Zentrum Dresden-Rossendorf (Author)
  • Nils Hoffmann, Chair of Waste Management and Circular Economy (Author)
  • Parvez Mohammed, Technische Universität Dresden (Author)
  • Gabriel A. Popoola, Muons, Inc. (Author)
  • Tamar Yovell, Helmholtz-Zentrum Dresden-Rossendorf (Author)
  • Vladyslav Oles, Oak Ridge National Laboratory (Author)
  • J. Austin Ellis, Oak Ridge National Laboratory (Author)
  • Sivasankaran Rajamanickam, Sandia National Laboratories (Author)
  • Attila Cangi, Helmholtz-Zentrum Dresden-Rossendorf (Author)

Abstract

A myriad of phenomena in materials science and chemistry rely on quantum-level simulations of the electronic structure in matter. While moving to larger length and time scales has been a pressing issue for decades, such large-scale electronic structure calculations are still challenging despite modern software approaches and advances in high-performance computing. The silver lining in this regard is the use of machine learning to accelerate electronic structure calculations; this line of research has recently attracted growing attention. The grand challenge therein is finding a suitable machine-learning model during a process called hyperparameter optimization. This search, however, adds a massive computational overhead on top of data generation. We accelerate the construction of neural network models by roughly two orders of magnitude by circumventing excessive training during the hyperparameter optimization phase. We demonstrate our workflow for Kohn-Sham density functional theory, the most popular computational method in materials science and chemistry.
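The abstract only names the idea of circumventing training during hyperparameter optimization; the sketch below is a rough illustration, not the authors' actual workflow. It assumes a NASWOT-style activation-diversity score (Mellor et al. 2021) as a training-free proxy: candidate widths and depths are ranked by evaluating untrained networks on a single mini-batch, with no gradient steps. All names (`build_mlp`, `naswot_score`), layer sizes, and descriptor dimensions are hypothetical.

```python
import itertools
import torch
import torch.nn as nn

def build_mlp(width, depth, n_in=91, n_out=250):
    # Hypothetical feed-forward network mapping local atomic-environment
    # descriptors to electronic-structure targets (sizes are illustrative).
    dims = [n_in] + [width] * depth + [n_out]
    layers = []
    for i in range(len(dims) - 2):
        layers += [nn.Linear(dims[i], dims[i + 1]), nn.ReLU()]
    layers.append(nn.Linear(dims[-2], dims[-1]))
    return nn.Sequential(*layers)

def naswot_score(model, x):
    # Training-free proxy: score an *untrained* network by how distinct
    # its ReLU activation patterns are across a mini-batch.
    codes, hooks = [], []

    def hook(_module, _inp, out):
        codes.append((out > 0).flatten(1).float())

    for m in model.modules():
        if isinstance(m, nn.ReLU):
            hooks.append(m.register_forward_hook(hook))
    with torch.no_grad():
        model(x)
    for h in hooks:
        h.remove()

    c = torch.cat(codes, dim=1)               # binary activation code per sample
    k = c @ c.t() + (1 - c) @ (1 - c.t())     # Hamming-similarity kernel
    k += 1e-3 * torch.eye(k.shape[0])         # jitter to keep k non-singular
    return torch.slogdet(k).logabsdet.item()  # higher = more diverse patterns

# Rank candidate hyperparameters by the proxy instead of training each one.
x = torch.randn(128, 91)  # stand-in batch of descriptor vectors
candidates = itertools.product([128, 256, 512], [2, 3, 4])  # width x depth grid
best = max(candidates, key=lambda wd: naswot_score(build_mlp(*wd), x))
print("proxy-selected (width, depth):", best)
```

Because each candidate costs only one forward pass rather than a full training run, a grid like this can be screened orders of magnitude faster, which is the kind of speedup the abstract reports.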

Details

Original language: English
Article number: 045008
Journal: Machine Learning: Science and Technology
Volume: 3
Issue number: 4
Publication status: Published - 1 Dec. 2022
Peer-review status: Yes

Keywords

  • density functional theory, hyperparameter optimization, neural networks, surrogate model