DeepWeightFlow: Re-Basined Flow Matching for Generating Neural Network Weights

📅 2026-01-08
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses generative modeling in the high-dimensional weight spaces of neural networks, where dimensionality and strong permutation symmetries prevent existing methods from efficiently producing complete, high-performing weights for large networks. The paper introduces DeepWeightFlow, which integrates Git Re-Basin with Flow Matching for the first time and incorporates TransFusion for weight normalization, effectively eliminating weight symmetries and enabling direct generation of diverse, high-accuracy network parameters in the original weight space. Without requiring fine-tuning, the generated models achieve strong transfer performance across diverse architectures and data modalities, and hundreds of high-performing models can be generated within minutes. The method substantially outperforms diffusion-based approaches, marking a significant advance in both generation efficiency and diversity.

📝 Abstract
Building efficient and effective generative models for neural network weights is a research focus of significant interest, but it faces challenges posed by the high-dimensional weight spaces of modern neural networks and their symmetries. Several prior generative models are limited to generating partial neural network weights, particularly for larger models such as ResNet and ViT. Those that do generate complete weights struggle with generation speed or require fine-tuning of the generated models. In this work, we present DeepWeightFlow, a Flow Matching model that operates directly in weight space to generate diverse and high-accuracy neural network weights for a variety of architectures, neural network sizes, and data modalities. The neural networks generated by DeepWeightFlow do not require fine-tuning to perform well and can scale to large networks. We apply Git Re-Basin and TransFusion for neural network canonicalization in the context of generative weight models to account for the impact of neural network permutation symmetries and to improve generation efficiency for larger model sizes. The generated networks excel at transfer learning, and ensembles of hundreds of neural networks can be generated in minutes, far exceeding the efficiency of diffusion-based methods. DeepWeightFlow models pave the way for more efficient and scalable generation of diverse sets of neural networks.
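The permutation symmetry the abstract refers to can be made concrete with a minimal sketch (illustrative only, not the paper's code): permuting the hidden units of a two-layer MLP, together with the matching rows and columns of its weight matrices, leaves the network's function unchanged. This is exactly the symmetry that Git Re-Basin-style canonicalization removes before fitting a generative model on weights.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 4, 8, 3

# Weights of a small ReLU MLP (dimensions chosen arbitrarily for illustration).
W1 = rng.normal(size=(d_hidden, d_in))
b1 = rng.normal(size=d_hidden)
W2 = rng.normal(size=(d_out, d_hidden))

def mlp(x, W1, b1, W2):
    # Two-layer network: W2 @ relu(W1 @ x + b1)
    return W2 @ np.maximum(W1 @ x + b1, 0.0)

# Apply a random permutation to the hidden layer: permute the rows of W1 and
# b1, and the corresponding columns of W2.
perm = rng.permutation(d_hidden)
W1_p, b1_p = W1[perm], b1[perm]
W2_p = W2[:, perm]

# The permuted network computes the identical function, so two functionally
# equal networks can look very different as raw weight vectors.
x = rng.normal(size=d_in)
assert np.allclose(mlp(x, W1, b1, W2), mlp(x, W1_p, b1_p, W2_p))
```

A generative model trained on raw weights would have to cover every such permuted copy of each trained network; canonicalizing all networks to a common "basin" first collapses these copies to one representative.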
Problem

Research questions and friction points this paper is trying to address.

neural network weight generation
high-dimensional weight space
permutation symmetries
generative modeling
weight space canonicalization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Flow Matching
Weight Generation
Git Re-Basin
Permutation Symmetry
Neural Network Canonicalization
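The Flow Matching ingredient above can be sketched with the standard conditional (straight-path) objective; this is a generic illustration under assumed names (`velocity_model`, `theta`), not the paper's implementation. A flattened weight vector from a trained, canonicalized network plays the role of the data sample.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 16  # toy dimensionality; real weight vectors are far larger

def velocity_model(x_t, t, theta):
    # Stand-in for a learned velocity field v_theta(x_t, t); here a simple
    # linear map over the concatenated state and time.
    return theta @ np.concatenate([x_t, [t]])

theta = rng.normal(size=(dim, dim + 1)) * 0.01

x1 = rng.normal(size=dim)   # "data": a canonicalized, flattened weight vector
x0 = rng.normal(size=dim)   # sample from the noise distribution
t = rng.uniform()           # random time in [0, 1]

# Straight interpolation path between noise and data, with constant target
# velocity x1 - x0; the model is regressed onto that target.
x_t = (1.0 - t) * x0 + t * x1
target = x1 - x0
loss = np.mean((velocity_model(x_t, t, theta) - target) ** 2)
```

At sampling time, new weight vectors are produced by integrating the learned velocity field from noise at t = 0 to t = 1, which is a deterministic ODE solve and is what makes generation much faster than iterative diffusion sampling.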
Saumya Gupta
Institute for Experiential AI, Northeastern University
Scott Biggs
Khoury College of Computer Sciences, Northeastern University
Moritz Laber
PhD student, Northeastern University
Zohair Shafi
Khoury College of Computer Sciences, Northeastern University
Robin Walters
Northeastern University
Deep Learning, Representation Theory, Algebraic Geometry, Protein Structure
Ayan Paul
Northeastern University
Computational Biology, Particle Physics, Interpretable Machine Learning, Mathematical Epidemiology