🤖 AI Summary
This work addresses the challenge of modeling the complex, hard-to-generalize solution spaces of nonlinear partial differential equations, such as Burgers' equation, by proposing an embedding approach that integrates physics-informed neural networks (PINNs) with a multi-head architecture. The method employs orthogonality-constrained heads to model solutions across varying initial conditions and viscosity parameters within a shared latent space, implicitly performing a principal-component decomposition that prevents training degeneracy. The resulting low-dimensional embeddings have a clear physical interpretation and capture the system's dynamical characteristics using only a few dominant modes. Experimental results show rapid saturation of the embedding components, confirming that the approach constructs compact, generalizable representations of the solution manifold.
📝 Abstract
Embeddings provide low-dimensional geometric representations that organize complex function spaces and support efficient retrieval, comparison, and generalization. In this work we extend the concept to physics-informed neural networks (PINNs). We present a method to construct solution embedding spaces of nonlinear partial differential equations using a multi-head setup, and to extract non-degenerate information from them using principal component analysis (PCA). We test this method on the viscous Burgers' equation, which is solved simultaneously for a family of initial conditions and viscosity values. A shared network body learns a latent embedding of the solution space, while linear heads map this embedding to individual realizations. By enforcing orthogonality constraints on the heads, we obtain a principal-component decomposition of the latent space that is robust to training degeneracies and admits a direct physical interpretation. The components obtained for Burgers' equation saturate rapidly, indicating that a small number of latent modes captures the dominant features of the dynamics.
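The shared-body-plus-orthogonal-heads architecture described above can be sketched in a few lines. The following is a minimal NumPy toy model, not the authors' implementation: the function names (`shared_body`, `orthogonality_penalty`), layer sizes, and latent dimension are all illustrative assumptions. It shows the structural idea only: a common network maps collocation points to a latent embedding, one linear head per PDE realization reads a solution off that embedding, and a Frobenius-norm penalty pushes the head weight vectors toward orthonormality so the heads act like principal components of the latent space.

```python
import numpy as np

rng = np.random.default_rng(0)

latent_dim, n_heads, hidden = 8, 4, 32  # illustrative sizes, not from the paper

# Randomly initialized parameters of a small shared MLP body.
params = (rng.normal(size=(2, hidden)), np.zeros(hidden),
          rng.normal(size=(hidden, latent_dim)), np.zeros(latent_dim))

# One linear head per solution realization (initial condition / viscosity pair).
heads = rng.normal(size=(n_heads, latent_dim))

def shared_body(x, params):
    """Shared body: map (t, x) collocation points to latent embeddings."""
    W1, b1, W2, b2 = params
    h = np.tanh(x @ W1 + b1)
    return np.tanh(h @ W2 + b2)  # shape (n_points, latent_dim)

def orthogonality_penalty(H):
    """Penalize deviation of head rows from orthonormality: ||H H^T - I||_F^2.

    Adding this term to the PINN loss discourages degenerate (linearly
    dependent) heads and orients them along distinct latent directions."""
    G = H @ H.T
    return np.sum((G - np.eye(H.shape[0])) ** 2)

# Forward pass: every head reads its own solution off the shared embedding.
pts = rng.uniform(size=(100, 2))   # 100 sampled (t, x) points
z = shared_body(pts, params)       # shared latent embedding, (100, latent_dim)
solutions = z @ heads.T            # per-realization outputs, (100, n_heads)
```

In a full training loop, `orthogonality_penalty(heads)` would be added to the physics-informed residual loss for each realization; here only the forward structure is shown.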