Low-Rank Adaptation of Neural Fields

📅 2025-04-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Neural fields (NFs) lack efficient encoding methods for lightweight dynamic updates. Method: This paper introduces Low-Rank Adaptation (LoRA) to the neural field domain, proposing a parameter-efficient and computationally lightweight framework for instance-level incremental editing. Unlike prior approaches, it does not rely on large pre-trained models; instead, it employs low-rank matrix decomposition to achieve compact parameter representation and rapid fine-tuning of neural fields, supporting multimodal tasks including image filtering, video compression, and geometry editing. Contribution/Results: Experiments demonstrate that the method reduces parameter count and GPU memory consumption by over 90% compared to baseline methods, enables real-time updates on resource-constrained devices, and maintains high-fidelity reconstruction quality. This work establishes a paradigm for personalized neural field adaptation and edge deployment.

📝 Abstract
Processing visual data often involves small adjustments or sequences of changes, such as in image filtering, surface smoothing, and video storage. While established graphics techniques like normal mapping and video compression exploit redundancy to encode such small changes efficiently, the problem of encoding small changes to neural fields (NF) -- neural network parameterizations of visual or physical functions -- has received less attention. We propose a parameter-efficient strategy for updating neural fields using low-rank adaptations (LoRA). LoRA, a method from the parameter-efficient fine-tuning LLM community, encodes small updates to pre-trained models with minimal computational overhead. We adapt LoRA to instance-specific neural fields, avoiding the need for large pre-trained models and yielding a pipeline suitable for low-compute hardware. We validate our approach with experiments in image filtering, video compression, and geometry editing, demonstrating its effectiveness and versatility for representing neural field updates.
Problem

Research questions and friction points this paper is trying to address.

Efficiently encoding small changes in neural fields
Adapting low-rank methods for neural field updates
Enabling lightweight neural field editing on low-compute hardware
Innovation

Methods, ideas, or system contributions that make the work stand out.

Low-rank adaptations for neural field updates
Parameter-efficient strategy with minimal overhead
Suitable for low-compute hardware applications
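The core idea above can be sketched in a few lines: a frozen base weight matrix is augmented with a trainable low-rank product, so each edit stores only the small factor matrices. This is a minimal NumPy illustration of the LoRA-style update, not the paper's implementation; the layer width and rank are assumed values chosen to match the reported >90% parameter savings.

```python
import numpy as np

d, r = 256, 4  # hidden width and adaptation rank (illustrative, not from the paper)

rng = np.random.default_rng(0)
W = rng.standard_normal((d, d))   # frozen base weight of one MLP layer

# Only A and B are trained and stored per edit.
A = rng.standard_normal((r, d)) * 0.01
B = np.zeros((d, r))              # zero init: the adapted field starts identical to the base

def adapted_forward(x):
    # Computes (W + B @ A) @ x without materializing the full d x d update.
    return W @ x + B @ (A @ x)

full_params = d * d          # cost of storing a full weight delta
lora_params = 2 * r * d      # cost of storing the low-rank factors
print(f"LoRA stores {lora_params} params vs {full_params} "
      f"({1 - lora_params / full_params:.1%} fewer)")
```

With these sizes the factors hold 2,048 parameters instead of 65,536, about 97% fewer, which is the kind of saving that makes per-instance edits cheap enough for low-compute hardware.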
Authors

Anh Truong, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology
Ahmed H. Mahmoud, MIT (HPC, Geometry Processing, 3D Graphics)
Mina Konaković Luković, Assistant Professor, MIT (Computer Graphics, Digital Geometry Processing, Computational Fabrication, Machine Learning)
Justin Solomon, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology