Example-Based Feature Painting on Textures

📅 2025-11-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Controlling localized artifacts such as stains, scratches, holes, wear, and discoloration in photorealistic texture generation remains challenging due to the lack of semantic supervision and precise spatial control. Method: The paper proposes an interactive, annotation-free texture editing framework that combines unsupervised anomaly detection with feature-space clustering to automatically discover semantically consistent local appearance variations. It integrates conditional GANs and diffusion models for fine-grained, feature-driven editing, and introduces an infinite stationary texture generation algorithm that enables output at arbitrary resolution. Contribution/Results: The method requires no manual annotations, only a small set of input images, and achieves high-fidelity, controllable artifact synthesis and editing. Experiments demonstrate state-of-the-art visual realism, editing accuracy, and resolution scalability, supporting both interactive refinement and seamless tiling at user-specified resolutions.
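The first two stages of the described pipeline (unsupervised anomaly detection, then clustering the anomalous features into coherent groups) can be sketched on toy data. Everything below is a hypothetical stand-in: the synthetic 8-D features, the nearest-neighbour anomaly score, and the fixed threshold are minimal illustrations, not the paper's actual backbone or detector.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stand-in per-pixel feature maps; in practice these would come from a
# learned backbone, here they are synthetic 8-D features for illustration.
H, W, D = 32, 32, 8
clean = rng.normal(0.0, 0.1, size=(H, W, D))          # blemish-free exemplar
test = clean + rng.normal(0.0, 0.05, size=(H, W, D))  # image to analyse
test[4:10, 4:10, 0] += 3.0     # hypothetical blemish type A (e.g. a stain)
test[20:26, 18:24, 1] -= 3.0   # hypothetical blemish type B (e.g. a scratch)

# Unsupervised anomaly score: distance to the nearest clean feature
# (brute-force nearest neighbour is fine at this toy scale).
clean_flat = clean.reshape(-1, D)
test_flat = test.reshape(-1, D)
d2 = ((test_flat[:, None, :] - clean_flat[None, :, :]) ** 2).sum(-1)
score = np.sqrt(d2.min(axis=1)).reshape(H, W)

# Threshold the score map into an anomaly mask, then cluster the anomalous
# features into semantically coherent groups that can condition generation.
mask = score > 0.5 * score.max()
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    test_flat[mask.ravel()])
```

With two well-separated feature offsets, the two injected blemish regions end up in distinct clusters, mirroring how the paper's grouping step yields per-feature labels without any manual annotation.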

📝 Abstract
In this work, we propose a system that covers the complete workflow for achieving controlled authoring and editing of textures that present distinctive local characteristics. These include various effects that change the surface appearance of materials, such as stains, tears, holes, abrasions, discoloration, and more. Such alterations are ubiquitous in nature, and including them in the synthesis process is crucial for generating realistic textures. We introduce a novel approach for creating textures with such blemishes, adopting a learning-based approach that leverages unlabeled examples. Our approach does not require manual annotations by the user; instead, it detects the appearance-altering features through unsupervised anomaly detection. The various textural features are then automatically clustered into semantically coherent groups, which are used to guide the conditional generation of images. Our pipeline as a whole goes from a small image collection to a versatile generative model that enables the user to interactively create and paint features on textures of arbitrary size. Notably, the algorithms we introduce for diffusion-based editing and infinite stationary texture generation are generic and should prove useful in other contexts as well. Project page: https://reality.tf.fau.de/pub/ardelean2025examplebased.html
Problem

Research questions and friction points this paper is trying to address.

Automating texture editing with distinctive local features like stains and holes
Enabling unsupervised learning for realistic material appearance alterations
Providing interactive tools for conditional generation of infinite textures
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unsupervised anomaly detection for texture feature identification
Automatic clustering of features into semantic groups
Generic algorithms for interactive diffusion-based editing and infinite stationary texture generation