🤖 AI Summary
To address numerical ambiguity and inefficiency arising from algebraic loops in equation-based modeling languages (e.g., Modelica), this paper proposes an unsupervised, residual-driven neural surrogate modeling method. A feedforward neural network serves as the surrogate, with the algebraic loop residual explicitly incorporated into the loss function—enabling end-to-end, label-free training that eliminates reliance on supervised datasets. Crucially, residual minimization resolves solution multiplicity by steering convergence toward physically consistent solutions—not statistical averages. An integrated error-control strategy ensures numerical accuracy. Evaluated on the IEEE 14-Bus system, the method achieves a 60% speedup over conventional numerical solvers while maintaining comparable precision. The core innovation lies in the first use of algebraic loop residuals as an unsupervised learning signal for neural networks, uniquely balancing robustness, interpretability, and engineering practicality.
📝 Abstract
This paper presents a residual-informed machine learning approach for replacing algebraic loops in equation-based Modelica models with neural network surrogates. A feedforward neural network is trained using the residual (error) of the algebraic loop directly in its loss function, eliminating the need for a supervised dataset. This training strategy also resolves the issue of ambiguous solutions, allowing the surrogate to converge to a consistent solution rather than averaging multiple valid ones. Applied to the large-scale IEEE 14-Bus system, our method achieves a 60% reduction in simulation time compared to conventional simulations, while maintaining the same level of accuracy through error control mechanisms.