🤖 AI Summary
This work introduces “Material Coating,” a novel image-based material editing task that simulates thin-layer material coverage while preserving the underlying object’s geometric details—contrasting with conventional material replacement methods that erase fine geometry. To formalize this task, we present the first large-scale synthetic dataset, DataCoat110K. We further propose a diffusion-based generative architecture that jointly models 2D albedo textures and physically based rendering (PBR) parameters—including roughness, metalness, transmission, and thickness—to enable fine-grained, controllable material coating. Extensive experiments and user studies demonstrate that our method significantly outperforms existing material editing and transfer approaches in realism, geometric fidelity, and parametric controllability, establishing a new paradigm for physics-aware, image-level material editing.
📝 Abstract
We introduce Material Coating, a novel image editing task that simulates applying a thin material layer onto an object while preserving its underlying coarse and fine geometry. Material coating is fundamentally different from existing "material transfer" methods, which are designed to replace an object's intrinsic material, often overwriting fine details. To address this new task, we construct DataCoat110K, a large-scale synthetic dataset (110K images) of 3D objects with varied, physically based coatings. We then propose CoatFusion, a novel architecture that enables this task by conditioning a diffusion model on both a 2D albedo texture and granular, PBR-style parametric controls, including roughness, metalness, transmission, and a key thickness parameter. Experiments and user studies show CoatFusion produces realistic, controllable coatings and significantly outperforms existing material editing and transfer methods on this new task.
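To make the conditioning setup concrete, here is a minimal, hypothetical sketch of how the described inputs could be assembled: a 2D albedo texture stacked with the four scalar PBR-style controls (roughness, metalness, transmission, thickness) broadcast to per-pixel planes, forming a single conditioning tensor for a diffusion model. The function name, tensor layout, and [0, 1] normalization are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def build_coating_condition(albedo, roughness, metalness, transmission, thickness):
    """Hypothetical sketch (not the paper's code): stack an HxWx3 albedo map
    with four scalar coating controls, each broadcast to an HxW plane,
    yielding an HxWx7 conditioning tensor (3 albedo + 4 parameter channels).
    """
    h, w, _ = albedo.shape
    params = [roughness, metalness, transmission, thickness]
    for p in params:
        # Assumption: controls are normalized to [0, 1], as is common for PBR maps.
        if not 0.0 <= p <= 1.0:
            raise ValueError("coating controls are assumed normalized to [0, 1]")
    planes = [np.full((h, w, 1), p, dtype=albedo.dtype) for p in params]
    return np.concatenate([albedo] + planes, axis=-1)

# Example: a small red albedo with a thin, glossy, non-metallic, translucent coat.
albedo = np.zeros((4, 4, 3), dtype=np.float32)
albedo[..., 0] = 1.0
cond = build_coating_condition(albedo, roughness=0.1, metalness=0.0,
                               transmission=0.6, thickness=0.2)
print(cond.shape)  # (4, 4, 7)
```

In practice such a tensor would be fed to the denoiser as extra input channels or via cross-attention; the sketch only illustrates how per-pixel texture and global scalar controls can share one conditioning representation.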