Weight Space Representation Learning with Neural Fields

📅 2025-12-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates the potential of neural network weights as structured representations, specifically addressing semantic modeling in the weight space of neural fields. To this end, we propose a multiplicative low-rank adaptation (LoRA) mechanism that imposes structured constraints on pretrained neural field backbone models, enabling the weights themselves to encode interpretable semantic information. Crucially, our method achieves this without introducing additional parameters, yielding high-quality, compact, and generalizable representations directly in weight space. Experimental results demonstrate substantial improvements over existing weight-space approaches across 2D/3D reconstruction and generation tasks. Our method supports fine-grained semantic editing and cross-task transferability. Furthermore, when integrated into latent diffusion frameworks, it enhances both generation fidelity and controllability.
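The summary above describes a multiplicative low-rank adaptation applied to a frozen neural field backbone. The exact parameterization is not specified here; the following is a minimal sketch of one plausible form, in which the frozen base weight is modulated elementwise by `(1 + B @ A)` rather than having `B @ A` added to it as in standard LoRA. All names and shapes (`W`, `A`, `B`, `rank`) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, rank = 8, 8, 2

# Frozen pretrained base weight, shared across all fitted instances.
W = rng.standard_normal((d_out, d_in))

# Per-instance low-rank factors; initialized near zero so the adapted
# weight starts close to the base. The pair (B, A) is the compact
# weight-space representation of one signal.
B = rng.standard_normal((d_out, rank)) * 0.01
A = rng.standard_normal((rank, d_in)) * 0.01

# Multiplicative adaptation (assumed form): modulate the base weight
# elementwise by (1 + B A) instead of the additive W + B A of vanilla LoRA.
W_adapted = W * (1.0 + B @ A)

# Forward pass of one adapted linear layer.
x = rng.standard_normal(d_in)
y = W_adapted @ x
print(y.shape)
```

Under this parameterization, only `B` and `A` vary per instance, so the representation of each signal is the low-rank factor pair rather than a full weight matrix, which matches the compactness claim in the summary.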

📝 Abstract
In this work, we investigate the potential of weights to serve as effective representations, focusing on neural fields. Our key insight is that constraining the optimization space through a pre-trained base model and low-rank adaptation (LoRA) can induce structure in weight space. Across reconstruction, generation, and analysis tasks on 2D and 3D data, we find that multiplicative LoRA weights achieve high representation quality while exhibiting distinctiveness and semantic structure. When used with latent diffusion models, multiplicative LoRA weights enable higher-quality generation than existing weight-space methods.
Problem

Research questions and friction points this paper is trying to address.

Learning weight representations for neural fields
Constraining optimization via pre-trained models and LoRA
Enhancing generation quality with multiplicative LoRA weights
Innovation

Methods, ideas, or system contributions that make the work stand out.

LoRA weights for structured weight space representation
Pre-trained base model constrains optimization space
Multiplicative LoRA enables high-quality latent diffusion generation