Negative order Sobolev cubatures: preconditioners of partial differential equation learning tasks circumventing numerical stiffness
Research output: Contribution to journal › Research article › Contributed › peer-review
Abstract
We present a variational approach aimed at enhancing the training of physics-informed neural networks (PINNs) and more general surrogate models for learning partial differential equations (PDEs). In particular, we extend our previously introduced notion of Sobolev cubatures to negative orders, enabling the approximation of negative order Sobolev norms. We mathematically prove that negative order Sobolev cubatures improve the condition number of discrete PDE learning problems, providing balancing scalars that mitigate the numerical stiffness caused by loss imbalances. Additionally, we consider polynomial surrogate models (PSMs), which maintain the flexibility of PINN formulations while preserving the convexity structure of the PDE operators. The combination of negative order Sobolev cubatures and PSMs yields well-conditioned discrete optimization problems, solvable via gradient descent that converges exponentially fast for λ-convex losses. Our theoretical contributions are supported by numerical experiments addressing linear and non-linear, forward and inverse PDE problems. These experiments show that Sobolev cubature-based PSMs outperform state-of-the-art PINN techniques.
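As general background for the convergence claim in the abstract (a standard estimate, not reproduced from the article; the symbols $\mathcal{L}$, $\theta$, $\theta^{*}$ are illustrative notation rather than the authors'), λ-convexity of a loss with λ > 0 implies exponential decay of its gradient flow toward the minimizer:

```latex
% Standard background estimate (not the article's derivation): for a
% \lambda-convex loss \mathcal{L} with \lambda > 0, the gradient flow
%   \dot{\theta}(t) = -\nabla\mathcal{L}(\theta(t))
% contracts exponentially fast toward the unique minimizer \theta^{*}.
\[
  \mathcal{L}(\vartheta)
    \ge \mathcal{L}(\theta)
      + \bigl\langle \nabla\mathcal{L}(\theta),\, \vartheta - \theta \bigr\rangle
      + \tfrac{\lambda}{2}\,\lVert \vartheta - \theta \rVert^{2}
  \quad\Longrightarrow\quad
  \lVert \theta(t) - \theta^{*} \rVert
    \le e^{-\lambda t}\,\lVert \theta(0) - \theta^{*} \rVert .
\]
```

On this reading, a better-conditioned discrete problem corresponds to a larger effective λ and hence a faster contraction rate of the flow.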
Details
Original language | English |
---|---|
Article number | 035029 |
Journal | Machine Learning: Science and Technology
Volume | 5 |
Issue number | 3 |
Publication status | Published - 1 Sept 2024 |
Peer-reviewed | Yes |
Keywords
- gradient flow
- negative Sobolev norms
- numerical stiffness
- PDE learning
- polynomial surrogate models
- preconditioning
- Sobolev cubatures