Residual-Informed Learning of Solutions to Algebraic Loops

📅 2025-10-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address numerical ambiguity and inefficiency arising from algebraic loops in equation-based modeling languages (e.g., Modelica), this paper proposes an unsupervised, residual-driven neural surrogate modeling method. A feedforward neural network serves as the surrogate, with the algebraic loop residual incorporated directly into the loss function, enabling end-to-end, label-free training that eliminates the need for a supervised dataset. Crucially, residual minimization resolves solution multiplicity by steering convergence toward a physically consistent solution rather than a statistical average of multiple valid ones. An integrated error-control strategy ensures numerical accuracy. Evaluated on the IEEE 14-Bus system, the method achieves a 60% reduction in simulation time compared to conventional numerical solvers while maintaining comparable precision. The core innovation lies in the first use of algebraic loop residuals as an unsupervised learning signal for neural networks, balancing robustness, interpretability, and engineering practicality.

📝 Abstract
This paper presents a residual-informed machine learning approach for replacing algebraic loops in equation-based Modelica models with neural network surrogates. A feedforward neural network is trained using the residual (error) of the algebraic loop directly in its loss function, eliminating the need for a supervised dataset. This training strategy also resolves the issue of ambiguous solutions, allowing the surrogate to converge to a consistent solution rather than averaging multiple valid ones. Applied to the large-scale IEEE 14-Bus system, our method achieves a 60% reduction in simulation time compared to conventional simulations, while maintaining the same level of accuracy through error control mechanisms.
Problem

Research questions and friction points this paper is trying to address.

Replacing algebraic loops with neural network surrogates
Training networks on the loop residual, without supervised datasets
Resolving ambiguous solutions to ensure consistent convergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses residual-informed machine learning for algebraic loops
Trains neural network with residual in loss function
Achieves faster simulation while maintaining accuracy
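The residual-in-loss idea above can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's actual setup: the toy loop equation y = cos(y) + x, the one-hidden-layer network size, and all hyperparameters are assumptions chosen for demonstration. The surrogate y = f(x) is trained purely by minimizing the squared loop residual r(x, y) = y - cos(y) - x, with no labeled (x, y) pairs.

```python
import numpy as np

# Toy algebraic loop (illustrative assumption): find y with
#   r(x, y) = y - cos(y) - x = 0.
# Train an MLP surrogate y = f(x) by minimizing mean r^2 directly,
# so no solver-generated labels are needed.

rng = np.random.default_rng(0)
X = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)   # loop "inputs"

# One-hidden-layer MLP, trained with plain gradient descent.
W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(15000):
    H = np.tanh(X @ W1 + b1)          # hidden activations
    Y = H @ W2 + b2                   # surrogate output y = f(x)
    r = Y - np.cos(Y) - X             # algebraic-loop residual
    # d(mean r^2)/dY, using dr/dY = 1 + sin(Y)
    g = 2.0 * r * (1.0 + np.sin(Y)) / len(X)
    dW2 = H.T @ g
    db2 = g.sum(0)
    dZ = (g @ W2.T) * (1.0 - H**2)    # backprop through tanh
    dW1 = X.T @ dZ
    db1 = dZ.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

Y = np.tanh(X @ W1 + b1) @ W2 + b2
residual = np.abs(Y - np.cos(Y) - X)
print(f"max |residual| after training: {residual.max():.4f}")
```

At inference time the surrogate replaces the iterative solve with a single forward pass, which is where the reported simulation-time savings would come from; the paper's error-control mechanism (falling back when the residual is too large) is not sketched here.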
Andreas Heuermann
Institute for Data Science Solutions, Bielefeld University of Applied Sciences and Arts, Germany
Philip Hannebohm
Institute for Data Science Solutions, Bielefeld University of Applied Sciences and Arts, Germany
Bernhard Bachmann
Institute for Data Science Solutions, Bielefeld University of Applied Sciences and Arts, Germany