Neural Network Interoperability Across Platforms

📅 2025-11-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
Neural network migration across mainstream frameworks (e.g., PyTorch and TensorFlow) remains challenging due to manual reconstruction requirements, poor compatibility, and semantic discrepancies. To address this, we propose a fully automated cross-framework migration method based on a hub-style intermediate representation (Hub IR). Our approach constructs a unified model IR via abstract syntax tree parsing, then performs semantic-aware structural mapping and framework-specific code generation to achieve bidirectional, functionally equivalent model translation. We systematically resolve two core challenges: cross-framework semantic divergence and topological structure mismatch—addressed for the first time in a unified framework. Experimental evaluation on five representative neural networks demonstrates functional equivalence of generated code, over 90% reduction in manual intervention, and substantial improvements in migration reliability and development efficiency.
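The pipeline described above (AST parsing into a framework-neutral IR) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the `LayerIR` schema, the `TORCH_TO_IR` op table, and `parse_torch_layers` are all hypothetical names invented for this sketch, and only constant constructor arguments are handled.

```python
import ast
from dataclasses import dataclass, field

@dataclass
class LayerIR:
    """Framework-neutral record for one layer (hypothetical Hub IR schema)."""
    op: str                              # canonical op name, e.g. "conv2d"
    params: dict = field(default_factory=dict)

# Map PyTorch constructor names to canonical IR ops (illustrative subset).
TORCH_TO_IR = {"Conv2d": "conv2d", "Linear": "dense", "ReLU": "relu"}

def parse_torch_layers(source: str) -> list[LayerIR]:
    """Walk the AST of PyTorch layer definitions and collect known constructors."""
    layers = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            # Constructor may appear as nn.Conv2d(...) or bare Conv2d(...)
            name = (node.func.attr if isinstance(node.func, ast.Attribute)
                    else getattr(node.func, "id", ""))
            if name in TORCH_TO_IR:
                args = [ast.literal_eval(a) for a in node.args]  # constants only
                layers.append(LayerIR(op=TORCH_TO_IR[name], params={"args": args}))
    return layers

src = """
self.conv = nn.Conv2d(3, 16, 3)
self.fc = nn.Linear(16, 10)
"""
ir = parse_torch_layers(src)  # two LayerIR records: conv2d, dense
```

A real system would also need to capture keyword arguments, layer ordering from the `forward` method, and non-sequential topologies, which the summary identifies as the harder part of the problem.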

📝 Abstract
The development of smart systems (i.e., systems enhanced with AI components) has thrived thanks to the rapid advancements in neural networks (NNs). A wide range of libraries and frameworks have consequently emerged to support NN design and implementation. The choice depends on factors such as available functionalities, ease of use, documentation and community support. After adopting a given NN framework, organizations might later choose to switch to another if performance declines, requirements evolve, or new features are introduced. Unfortunately, migrating NN implementations across libraries is challenging due to the lack of migration approaches specifically tailored for NNs. This leads to increased time and effort to modernize NNs, as manual updates are necessary to avoid relying on outdated implementations and ensure compatibility with new features. In this paper, we propose an approach to automatically migrate neural network code across deep learning frameworks. Our method makes use of a pivot NN model to create an abstraction of the NN prior to migration. We validate our approach using two popular NN frameworks, namely PyTorch and TensorFlow. We also discuss the challenges of migrating code between the two frameworks and how they were approached in our method. Experimental evaluation on five NNs shows that our approach successfully migrates their code and produces NNs that are functionally equivalent to the originals. Artefacts from our work are available online.
Problem

Research questions and friction points this paper is trying to address.

Automating neural network migration between deep learning frameworks
Addressing interoperability challenges across platforms like PyTorch and TensorFlow
Eliminating manual effort to modernize outdated neural network implementations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Automatically migrates neural network code across frameworks
Uses pivot model to abstract neural networks before migration
Validated on PyTorch and TensorFlow producing equivalent networks
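The generation side of the pivot-model idea can be sketched as two emitter tables keyed by the same canonical op names. All names here are hypothetical illustrations, not the paper's artefacts; the comment on argument conventions reflects the real API difference between the two frameworks, which is one instance of the semantic divergence the approach must reconcile.

```python
# Hypothetical op tables for both target frameworks. Real argument
# conventions differ: Keras Dense takes only the output units (input size
# is inferred), while PyTorch Linear takes in- and out-features, so a
# faithful translator must rewrite arguments, not just rename constructors.
IR_TO_TORCH = {"conv2d": "nn.Conv2d", "dense": "nn.Linear", "relu": "nn.ReLU"}
IR_TO_KERAS = {"conv2d": "layers.Conv2D", "dense": "layers.Dense", "relu": "layers.ReLU"}

def emit_layer(op: str, args: list, table: dict) -> str:
    """Render one pivot-model layer as a framework-specific constructor call."""
    return f"{table[op]}({', '.join(map(str, args))})"

torch_line = emit_layer("dense", [16, 10], IR_TO_TORCH)  # "nn.Linear(16, 10)"
keras_line = emit_layer("dense", [10], IR_TO_KERAS)      # "layers.Dense(10)"
```

Because both directions read from the same pivot representation, adding a third framework means writing one new parser and one new emitter rather than pairwise translators.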
Nadia Daoudi
Luxembourg Institute of Science and Technology
Iván Alfonso
Luxembourg Institute of Science and Technology
Jordi Cabot
Head of the Software Engineering RDI Unit at Luxembourg Institute of Science and Technology (LIST)
software engineering · modeling · open source · low-code · AI