Negative order Sobolev cubatures: preconditioners of partial differential equation learning tasks circumventing numerical stiffness
Publication: Contribution to journal › Research article › Contributed › Peer-reviewed
Contributors
Abstract
We present a variational approach aimed at enhancing the training of physics-informed neural networks (PINNs) and more general surrogate models for learning partial differential equations (PDEs). In particular, we extend our previously introduced notion of Sobolev cubatures to negative orders, enabling the approximation of negative order Sobolev norms. We prove mathematically that negative order Sobolev cubatures improve the condition number of discrete PDE learning problems, providing balancing scalars that mitigate the numerical stiffness caused by loss imbalances. Additionally, we consider polynomial surrogate models (PSMs), which maintain the flexibility of PINN formulations while preserving the convexity structure of the PDE operators. The combination of negative order Sobolev cubatures and PSMs delivers well-conditioned discrete optimization problems, solvable via gradient descent with exponentially fast convergence for λ-convex losses. Our theoretical contributions are supported by numerical experiments addressing linear and non-linear, forward and inverse PDE problems. These experiments show the Sobolev cubature-based PSMs to outperform current state-of-the-art PINN techniques.
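For reference, the negative order Sobolev norm that such cubatures approximate admits the standard dual characterization and, on a periodic domain, an equivalent Fourier-multiplier form (for s > 0):

$$\|f\|_{H^{-s}} \;=\; \sup_{0 \neq g \in H^{s}} \frac{\langle f, g\rangle_{L^{2}}}{\|g\|_{H^{s}}}, \qquad \|f\|_{H^{-s}}^{2} \;=\; \sum_{k \in \mathbb{Z}} \bigl(1 + |k|^{2}\bigr)^{-s}\,\bigl|\hat f_{k}\bigr|^{2} \quad \text{(periodic case)}.$$

The following is a minimal sketch, assuming a 1-D, 2π-periodic grid and a plain FFT-based Fourier multiplier rather than the paper's polynomial Sobolev cubatures, of how an H^{-s} residual norm down-weights high-frequency residual components, which is the preconditioning effect referred to above. The function name and setup are illustrative, not taken from the paper.

```python
# Illustrative only: a spectral stand-in for a negative-order Sobolev residual
# norm on a 1-D periodic grid. The paper's Sobolev cubatures are a different,
# polynomial/cubature-based construction; everything below is a hypothetical
# example of the underlying norm, not the authors' implementation.
import numpy as np

def h_minus_s_norm_sq(residual, s=1.0):
    """Approximate ||r||_{H^{-s}}^2 (up to constant normalization) for a PDE
    residual sampled on a uniform 2*pi-periodic grid, via the Fourier
    multiplier (1 + |k|^2)^{-s}."""
    n = residual.size
    k = np.fft.fftfreq(n, d=1.0 / n)        # integer wavenumbers 0, 1, ..., -1
    r_hat = np.fft.fft(residual) / n        # normalized Fourier coefficients
    weights = (1.0 + k**2) ** (-s)          # damp high-frequency modes
    return float(np.sum(weights * np.abs(r_hat) ** 2))

# High-frequency residual components are strongly down-weighted compared with
# low-frequency ones, so the norm tempers the modes that otherwise dominate a
# strong-form (L^2) residual loss and cause numerical stiffness.
x = np.linspace(0.0, 2 * np.pi, 256, endpoint=False)
print(h_minus_s_norm_sq(np.sin(x), s=1.0))       # low-frequency residual
print(h_minus_s_norm_sq(np.sin(40 * x), s=1.0))  # high-frequency residual, much smaller
```

Because the multiplier (1 + |k|^2)^{-s} damps exactly the modes that inflate the condition number of a strong-form loss, minimizing such a norm acts as a preconditioner for the discrete learning problem, under the assumptions of this sketch.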
Details
| Original language | English |
|---|---|
| Article number | 035029 |
| Journal | Machine learning: science and technology |
| Volume | 5 |
| Issue number | 3 |
| Publication status | Published - 1 Sept 2024 |
| Peer-reviewed | Yes |
Keywords
- gradient flow, negative Sobolev norms, numerical stiffness, PDE learning, polynomial surrogate models, preconditioning, Sobolev cubatures