Hyperparameter optimization is a crucial task in numerous applications of numerical modelling. Methods as diverse as classical simulations and the wide variety of machine learning techniques in use today require an appropriate choice of their hyperparameters (HPs). While calibrating classical simulations to measured data with numerical optimization techniques has a long tradition, the HPs of neural networks are often chosen by a mixture of grid search, random search and manual tuning. In the present study, the expert tool “OmniOpt” is introduced, which allows the user to optimize the HPs of a wide range of problems, from classical simulations to different kinds of neural networks. The emphasis is on versatility and flexibility for the user, both in the applications supported and in the choice of HPs to be optimized. Moreover, the optimization procedure, which is usually very time-consuming, is performed in a highly parallel way on the HPC system Taurus at TU Dresden. To this end, a Bayesian stochastic optimization algorithm, the Tree-structured Parzen Estimator (TPE), has been implemented on the Taurus system and connected to a user-friendly graphical user interface (GUI). In addition to the automatic optimization service, a variety of tools is provided for analyzing and graphically displaying the results of the optimization. The application of OmniOpt to a practical problem from materials science is presented as an example.
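The TPE algorithm mentioned in the abstract can be sketched in a minimal, self-contained form: completed trials are split into a “good” and a “bad” group by a loss quantile, each group is modelled with a kernel density estimate, and the next candidate is the one maximizing the density ratio l(x)/g(x). The toy objective, bandwidth, and evaluation budget below are illustrative assumptions for a single continuous HP; this is not OmniOpt's actual implementation.

```python
import math
import random

def objective(x):
    # Toy objective (assumed for illustration): minimum at x = 2.0.
    return (x - 2.0) ** 2

def kde(samples, x, bandwidth=0.5):
    # Gaussian kernel density estimate over a list of 1-D samples.
    return sum(
        math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples
    ) / (len(samples) * bandwidth * math.sqrt(2 * math.pi))

def tpe_minimize(f, low, high, n_startup=10, n_iters=40, gamma=0.25, seed=0):
    rng = random.Random(seed)
    # Startup phase: evaluate random points to seed the density models.
    xs = [rng.uniform(low, high) for _ in range(n_startup)]
    trials = [(x, f(x)) for x in xs]
    for _ in range(n_iters):
        trials.sort(key=lambda t: t[1])
        # Split trials: best gamma-fraction is "good", the rest is "bad".
        n_good = max(1, int(gamma * len(trials)))
        good = [x for x, _ in trials[:n_good]]
        bad = [x for x, _ in trials[n_good:]] or good
        # Sample candidates around "good" points and pick the one that
        # maximizes the density ratio l(x)/g(x), a proxy for expected
        # improvement in the TPE framework.
        candidates = [
            min(high, max(low, rng.gauss(rng.choice(good), 0.5)))
            for _ in range(24)
        ]
        x_next = max(
            candidates, key=lambda x: kde(good, x) / (kde(bad, x) + 1e-12)
        )
        trials.append((x_next, f(x_next)))
    return min(trials, key=lambda t: t[1])

best_x, best_y = tpe_minimize(objective, -5.0, 5.0)
print(best_x, best_y)
```

In practice, a production TPE implementation (as in the hyperopt library underlying many HP optimizers) additionally handles discrete and tree-structured search spaces and adaptive bandwidths; the sketch above only conveys the good/bad density-ratio idea.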
|Title||High Performance Computing - ISC High Performance Digital 2021 International Workshops, 2021, Revised Selected Papers|
|Editors||Heike Jagode, Hartwig Anzt, Hatem Ltaief, Piotr Luszczek|
|Publisher||Springer Science and Business Media B.V.|
|Publication status||Published - 13 Nov 2021|
|Series||Lecture Notes in Computer Science|
|Title||International Conference on High Performance Computing, ISC High Performance 2021|
|Duration||24 June - 2 July 2021|