Diverse Score Distillation

📅 2024-12-09
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing score distillation methods based on 2D diffusion models suffer from insufficient output diversity in 3D optimization, struggling to balance fidelity and sample richness. To address this, we propose Diverse Score Distillation (DSD), the first framework to explicitly model and integrate the stochasticity of diffusion sampling paths into 3D optimization. DSD leverages random initialization seeds to induce diverse optimization trajectories, and incorporates path-aware score modeling, differentiable rendering, and gradient approximation techniques to accommodate non-ideal rendering evolution. Evaluated on text-to-3D generation and single-view reconstruction tasks, DSD significantly enhances output diversity while matching, and often surpassing, the geometric and textural fidelity of state-of-the-art methods.

📝 Abstract
Score distillation of 2D diffusion models has proven to be a powerful mechanism to guide 3D optimization, for example enabling text-based 3D generation or single-view reconstruction. A common limitation of existing score distillation formulations, however, is that the outputs of the (mode-seeking) optimization are limited in diversity despite the underlying diffusion model being capable of generating diverse samples. In this work, inspired by the sampling process in denoising diffusion, we propose a score formulation that guides the optimization to follow generation paths defined by random initial seeds, thus ensuring diversity. We then present an approximation to adopt this formulation for scenarios where the optimization may not precisely follow the generation paths (e.g., a 3D representation whose renderings evolve in a co-dependent manner). We showcase the applications of our `Diverse Score Distillation' (DSD) formulation across tasks such as 2D optimization, text-based 3D inference, and single-view reconstruction. We also empirically validate DSD against prior score distillation formulations and show that it significantly improves sample diversity while preserving fidelity.
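To make the mode-seeking vs. path-following distinction concrete, here is a minimal toy sketch in NumPy. It is not the paper's implementation: `toy_denoiser` is a hypothetical stand-in for a pretrained diffusion model's noise predictor, and tying the noise to a fixed initial seed (`eps0`) is a deliberately simplified proxy for following a seed-defined generation path. Standard SDS-style distillation resamples noise every step, which pushes all runs toward a dominant mode; fixing the seed per run makes different seeds converge to different outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_denoiser(x_t, t):
    """Hypothetical stand-in for a pretrained diffusion model's noise
    prediction: nudges x_t toward a two-mode target distribution."""
    modes = np.array([[-1.0, 0.0], [1.0, 0.0]])
    d = modes - x_t
    w = np.exp(-0.5 * (d ** 2).sum(axis=-1))
    w /= w.sum()
    target = (w[:, None] * modes).sum(axis=0)  # soft-nearest mode
    return x_t - target  # "predicted noise" direction

def sds_grad(x, t, sigma):
    """SDS-style gradient: fresh noise each step (mode-seeking)."""
    eps = rng.standard_normal(x.shape)
    x_t = x + sigma * eps
    return toy_denoiser(x_t, t) - eps

def seeded_grad(x, t, sigma, eps0):
    """Simplified diversity-preserving variant: the noise is tied to a
    fixed per-run seed, so each seed traces its own optimization path."""
    x_t = x + sigma * eps0
    return toy_denoiser(x_t, t) - eps0
```

Running gradient descent with `seeded_grad` from the same initialization but two different `eps0` draws yields two distinct optimized points, whereas repeated `sds_grad` runs tend to collapse to similar solutions; this is the diversity effect the abstract describes, in caricature.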
Problem

Research questions and friction points this paper is trying to address.

Enhancing diversity in 3D optimization outputs
Overcoming limitations of mode-seeking score distillation
Ensuring diversity while preserving fidelity in generation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Diverse Score Distillation for 3D optimization
Follows random initial seeds for diversity
Approximation for co-dependent 3D rendering