🤖 AI Summary
This paper addresses poor visibility and inaccurate transaction forecasting in supply chain networks, problems caused by firms' unobservable internal production functions. We propose a novel architecture integrating temporal graph neural networks with an interpretable inventory dynamics module. The method implicitly models production functions via attention mechanisms and employs a dedicated loss function for end-to-end training. Crucially, it embeds physical constraints (specifically, inventory evolution laws) directly into the learning process, balancing model expressiveness with interpretability. Experiments on real-world supply chain data and on the SupplySim simulation benchmark demonstrate that our approach improves production function inference accuracy by 6%-50% over state-of-the-art baselines and reduces future transaction prediction error by 11%-62%. These gains significantly enhance supply chain transparency and forecasting robustness.
📝 Abstract
The global economy relies on the flow of goods over supply chain networks, with nodes as firms and edges as transactions between firms. While we may observe these external transactions, they are governed by unseen production functions, which determine how firms internally transform the input products they receive into the output products that they sell. In this setting, it can be extremely valuable to infer these production functions, to improve supply chain visibility and to forecast future transactions more accurately. However, existing graph neural networks (GNNs) cannot capture these hidden relationships between nodes' inputs and outputs. Here, we introduce a new class of models for this setting by combining temporal GNNs with a novel inventory module, which learns production functions via attention weights and a special loss function. We evaluate our models extensively on real supply chain data and on data generated from our new open-source simulator, SupplySim. Our models successfully infer production functions, outperforming the strongest baseline by 6%-50% (across datasets), and forecast future transactions, outperforming the strongest baseline by 11%-62%.
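The inventory evolution law that the inventory module builds on can be illustrated with a simple update rule. The sketch below is an assumption-laden toy, not the paper's implementation: the matrix `A`, the names `buys` and `sells`, and the fixed conversion rates are all hypothetical stand-ins for the production function that the paper learns via attention weights.

```python
import numpy as np

# Hypothetical sketch of one firm's inventory dynamics. A[i, j] stands
# in for the learned attention weight: units of input product i consumed
# per unit of output product j produced. In the paper this mapping is
# learned; here it is fixed purely for illustration.
A = np.array([[1.0, 0.0],
              [0.5, 1.0],
              [0.0, 2.0]])  # 3 input products, 2 output products

rng = np.random.default_rng(0)
inventory = np.zeros(3)  # current stock of each input product

for t in range(5):
    buys = rng.uniform(0.0, 2.0, size=3)    # observed incoming transactions
    sells = rng.uniform(0.0, 0.5, size=2)   # observed outgoing transactions
    consumed = A @ sells                    # inputs used up to produce outputs
    # Inventory evolution law: stock rises with purchases, falls with
    # consumption, and cannot go negative.
    inventory = np.maximum(inventory + buys - consumed, 0.0)

print(inventory)
```

Embedding this constraint into training means the model's attention weights must explain the observed transactions in a way that keeps every firm's implied inventory physically plausible, which is one source of the interpretability the summary describes.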