Text2Weight: Bridging Natural Language and Neural Network Weight Spaces

📅 2025-08-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing methods struggle to directly generate generalizable neural network weights from natural language descriptions. This paper proposes T2W, the first end-to-end text-to-weights framework. Built upon a diffusion Transformer architecture, T2W integrates CLIP text embeddings with prior-aware attention, and introduces hierarchical parameter modeling and weight-space adversarial augmentation. Its core innovation lies in constructing a generalizable joint text-weight representation space, enabling cross-task weight generation and text-guided model fusion. Evaluated on CIFAR-100, Caltech-256, and TinyImageNet, T2W-generated weights consistently outperform conventional optimization-based initialization methods—demonstrating superior quality, strong generalization across domains and architectures, and practical deployability.
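The hierarchical parameter modeling mentioned above can be illustrated with a minimal sketch: flatten each layer's weights, concatenate them, and chunk the result into uniform, zero-padded blocks that a transformer can tokenize. The layer shapes and block size here are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

# Hypothetical layer shapes for a small classifier (illustrative only).
layer_shapes = [(64, 3, 3, 3), (128, 64, 3, 3), (10, 128)]

BLOCK = 256  # assumed uniform block size; the paper's actual size may differ

def weights_to_blocks(shapes, block=BLOCK, seed=0):
    """Flatten per-layer weights and chunk them into equal-size blocks,
    zero-padding the tail so every block has the same length."""
    rng = np.random.default_rng(seed)
    flat = np.concatenate([rng.standard_normal(np.prod(s)) for s in shapes])
    pad = (-len(flat)) % block          # zeros needed to fill the last block
    flat = np.pad(flat, (0, pad))
    return flat.reshape(-1, block), pad

blocks, pad = weights_to_blocks(layer_shapes)
print(blocks.shape)  # each row is one uniform weight block
```

Uniform blocks let heterogeneous layers (conv kernels, linear weights) share one token format, which is what makes a sequence model over parameters practical.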

📝 Abstract
How far are we really from automatically generating neural networks? While neural network weight generation shows promise, current approaches struggle with generalization to unseen tasks and practical application exploration. To address this, we propose T2W, a diffusion transformer framework that generates task-specific weights conditioned on natural language descriptions. T2W hierarchically processes network parameters into uniform blocks, integrates text embeddings from CLIP via a prior attention mechanism, and employs adversarial training with weight-space augmentation to enhance generalization. Experiments on CIFAR-100, Caltech-256, and TinyImageNet demonstrate T2W's ability to produce high-quality weights for unseen tasks, outperforming optimization-based initialization and enabling novel applications such as weight enhancement and text-guided model fusion. Our work bridges textual semantics with weight-space dynamics, supported by an open-source dataset of text-weight pairs, advancing the practicality of generative models in neural network parameter synthesis. Our code is available on GitHub.
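The text conditioning described in the abstract can be sketched as cross-attention, where tokenized weight blocks attend to a CLIP-style text embedding sequence. This is plain cross-attention; the paper's "prior attention mechanism" presumably adds more, which is omitted here, and all dimensions are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(weight_tokens, text_tokens, d=16, seed=0):
    """Weight-block tokens (queries) attend to text tokens (keys/values).
    Random projections stand in for learned ones in this sketch."""
    rng = np.random.default_rng(seed)
    Wq = rng.standard_normal((weight_tokens.shape[-1], d))
    Wk = rng.standard_normal((text_tokens.shape[-1], d))
    Wv = rng.standard_normal((text_tokens.shape[-1], d))
    Q, K, V = weight_tokens @ Wq, text_tokens @ Wk, text_tokens @ Wv
    A = softmax(Q @ K.T / np.sqrt(d))   # (num_blocks, num_text_tokens)
    return A @ V                        # text-conditioned block features

rng = np.random.default_rng(1)
blocks = rng.standard_normal((300, 256))  # tokenized weight blocks
text = rng.standard_normal((77, 512))     # CLIP-style text sequence (assumed)
out = cross_attention(blocks, text)
print(out.shape)
```

Each weight block thus receives a text-conditioned feature, which is how a diffusion transformer can steer denoised weights toward the described task.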
Problem

Research questions and friction points this paper addresses.

Generating neural network weights from natural language descriptions
Improving generalization of weight generation to unseen tasks
Bridging textual semantics with neural network weight-space dynamics
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generates task-specific weights conditioned on natural-language descriptions
Hierarchically partitions network parameters into uniform blocks
Fuses CLIP text embeddings via a prior attention mechanism
Applies adversarial training with weight-space augmentation to improve generalization
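One simple, function-preserving form of weight-space augmentation is permuting the hidden units of adjacent layers: the weight vector changes, but the network it encodes does not. This is a generic sketch, not necessarily the paper's exact augmentation scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer MLP: y = W2 @ relu(W1 @ x + b1) + b2 (illustrative sizes)
W1, b1 = rng.standard_normal((8, 4)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((3, 8)), rng.standard_normal(3)

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0) + b2

# Permute hidden units: reorder rows of W1 (and b1) and the matching
# columns of W2. This yields a distinct weight sample that computes the
# same function, cheaply enlarging a text-weight training set.
perm = rng.permutation(8)
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

x = rng.standard_normal(4)
print(np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2)))  # True
```

Such symmetries are exactly what makes raw weight spaces hard for generative models, and why augmenting over them can improve generalization.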