OmniX: From Unified Panoramic Generation and Perception to Graphics-Ready 3D Scenes

📅 2025-10-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge that 2D panoramic images inherently lack the geometric and material fidelity required for physically based rendering (PBR), relighting, and simulation. To this end, the authors propose OmniX, a PBR-oriented, panorama-driven 3D scene generation framework. The method jointly models panoramic geometry, texture, and PBR material properties (albedo, roughness, and metallic) within a unified representation. A lightweight cross-modal adapter bridges pretrained 2D generative priors with panoramic perception tasks, while a large-scale multimodal synthetic panorama dataset enables robust training. Unlike prior approaches that focus solely on appearance reconstruction, OmniX achieves end-to-end, perception-guided generation of PBR-ready 3D scenes. Experiments show significant improvements over state-of-the-art methods on panoramic completion and 3D reconstruction, and the generated scenes are directly compatible with real-time PBR rendering and virtual simulation, supporting immersive virtual world construction.
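The summary's key mechanism is a lightweight adapter that steers a frozen 2D generative backbone toward per-modality panoramic outputs (albedo, roughness, metallic, depth). The paper does not publish this code here, so the following is only a minimal NumPy sketch under the assumption that the adapter behaves like a low-rank residual (LoRA-style) per modality; all names (`W_backbone`, `adapters`, `forward`) are hypothetical.

```python
import numpy as np

# Hedged sketch, NOT the paper's implementation: a "cross-modal adapter"
# modeled as a low-rank residual added to a frozen 2D backbone's features,
# with one adapter per target modality.
rng = np.random.default_rng(0)

D, R = 64, 4  # backbone feature width, adapter rank (assumed values)
MODALITIES = ["albedo", "roughness", "metallic", "depth"]

# Frozen stand-in for the pretrained 2D generative prior.
W_backbone = rng.standard_normal((D, D)) / np.sqrt(D)

# One lightweight low-rank adapter (A @ B) per modality.
adapters = {
    m: (
        rng.standard_normal((D, R)) / np.sqrt(D),  # down-projection A
        np.zeros((R, D)),                          # up-projection B, zero-init
    )
    for m in MODALITIES
}

def forward(x, modality):
    """Frozen backbone features plus the modality-specific residual."""
    A, B = adapters[modality]
    return x @ W_backbone + (x @ A) @ B

x = rng.standard_normal((8, D))  # 8 "pixel tokens" of a panorama
out = forward(x, "roughness")
```

With `B` zero-initialized, each adapter starts as an identity residual, so the frozen prior's behavior is preserved before fine-tuning; only the small `A`/`B` matrices would be trained per modality, which is what makes such adapters "lightweight".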

📝 Abstract
There are two prevalent ways of constructing 3D scenes: procedural generation and 2D lifting. Among them, panorama-based 2D lifting has emerged as a promising technique, leveraging powerful 2D generative priors to produce immersive, realistic, and diverse 3D environments. In this work, we advance this technique to generate graphics-ready 3D scenes suitable for physically based rendering (PBR), relighting, and simulation. Our key insight is to repurpose 2D generative models for panoramic perception of geometry, textures, and PBR materials. Unlike existing 2D lifting approaches that emphasize appearance generation and ignore the perception of intrinsic properties, we present OmniX, a versatile and unified framework. Based on a lightweight and efficient cross-modal adapter structure, OmniX reuses 2D generative priors for a broad range of panoramic vision tasks, including panoramic perception, generation, and completion. Furthermore, we construct a large-scale synthetic panorama dataset containing high-quality multimodal panoramas from diverse indoor and outdoor scenes. Extensive experiments demonstrate the effectiveness of our model in panoramic visual perception and graphics-ready 3D scene generation, opening new possibilities for immersive and physically realistic virtual world generation.
Problem

Research questions and friction points this paper addresses.

Generating graphics-ready 3D scenes from panoramic data
Repurposing 2D generative models for panoramic perception tasks
Creating physically realistic virtual environments with PBR materials
Innovation

Methods, ideas, or system contributions that make the work stand out.

Repurposing 2D generative models for panoramic perception
Using cross-modal adapter structure for unified framework
Generating graphics-ready 3D scenes with PBR materials