Learning production functions for supply chains with graph neural networks

📅 2024-07-26
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses low visibility and inaccurate transaction forecasting in supply chain networks, both caused by firms' unobservable internal production functions. We propose a novel architecture integrating temporal graph neural networks with an interpretable inventory dynamics module. The method implicitly models production functions via attention mechanisms and employs a dedicated loss function for end-to-end training. Crucially, it embeds physical constraints—specifically, inventory evolution laws—directly into the learning process, thereby balancing model expressiveness and interpretability. Experiments on real-world supply chain data and the SupplySim simulation benchmark demonstrate that our approach improves production function inference accuracy by 6%–50% over the strongest baselines and reduces future transaction prediction error by 11%–62%. These gains significantly enhance supply chain transparency and forecasting robustness.

📝 Abstract
The global economy relies on the flow of goods over supply chain networks, with nodes as firms and edges as transactions between firms. While we may observe these external transactions, they are governed by unseen production functions, which determine how firms internally transform the input products they receive into output products that they sell. In this setting, it can be extremely valuable to infer these production functions, to improve supply chain visibility and to forecast future transactions more accurately. However, existing graph neural networks (GNNs) cannot capture these hidden relationships between nodes' inputs and outputs. Here, we introduce a new class of models for this setting by combining temporal GNNs with a novel inventory module, which learns production functions via attention weights and a special loss function. We evaluate our models extensively on real supply chain data and data generated from our new open-source simulator, SupplySim. Our models successfully infer production functions, outperforming the strongest baseline by 6%–50% (across datasets), and forecast future transactions, outperforming the strongest baseline by 11%–62%.
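The core idea in the abstract—learning production functions via attention weights—can be sketched minimally: score each of a firm's candidate input products against one of its output products, and read the normalized attention weights as an interpretable estimate of the production recipe. Everything below (the function name, scaled dot-product scoring, the toy embeddings) is an illustrative assumption, not the paper's implementation, which additionally couples these weights to inventory-evolution constraints and a dedicated loss.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def infer_production_weights(input_emb, output_emb):
    """Attention over a firm's input-product embeddings for one output product.

    input_emb:  (num_inputs, d) array of input-product embeddings.
    output_emb: (d,) embedding of the output product (acts as the query).
    Returns a (num_inputs,) vector of attention weights summing to 1,
    interpreted as the estimated mix of inputs consumed to make the output.
    """
    scores = input_emb @ output_emb / np.sqrt(output_emb.shape[0])
    return softmax(scores)

# Deterministic toy example: 3 candidate inputs, 4-dim embeddings.
inputs = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0, 0.0],
                   [0.0, 0.0, 1.0, 0.0]])
output = np.array([0.1, 0.9, 0.1, 0.0])  # output mostly resembles input 1
w = infer_production_weights(inputs, output)
```

Here the second input receives the largest weight, matching the construction of the toy output embedding; in the paper's setting such weights are learned end-to-end from observed transactions rather than computed from fixed embeddings.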
Problem

Research questions and friction points this paper is trying to address.

Infer hidden production functions
Improve supply chain visibility
Forecast future transactions accurately
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph neural networks
Inventory module
Attention weights
Serina Chang
Stanford University, Department of Computer Science
Zhiyin Lin
Stanford University, Department of Computer Science
Benjamin Yan
Stanford University, Department of Computer Science
Swapnil Bembde
Hitachi America, Ltd.
Qi Xiu
Hitachi America, Ltd.
Chi Heem Wong
Principal Research Scientist, Hitachi America | Visiting Scholar, Stanford
Yu Qin
Peking University
Frank Kloster
Hitachi America, Ltd.
Alex Luo
Hitachi America, Ltd.
Raj Palleti
Stanford University, Department of Computer Science; Hitachi America, Ltd.
J. Leskovec
Stanford University, Department of Computer Science