🤖 AI Summary
Existing neural style transfer and diffusion models struggle to explicitly control the brushstroke composition and detail hierarchy inherent in artistic styles. This work proposes a regression-based image stylization method that leverages a procedurally generated, extensible set of brushstroke primitives to enable fine-grained manipulation of multidimensional attributes—including shape, size, orientation, density, color, and noise. By integrating a U-Net regression architecture, the approach preserves the structural integrity of the input image while allowing intuitive user-guided adjustments to stylistic details. Experimental results demonstrate that the method produces diverse, expressive, and highly customizable artistic images, effectively overcoming the black-box limitations of conventional style transfer techniques.
📝 Abstract
We present a novel regression-based method for artistically stylizing images. Unlike recent neural style transfer or diffusion-based approaches, our method allows explicit control over the stroke composition and level of detail in the rendered image through the use of an extensible set of stroke patches. The stroke patch sets are procedurally generated by small programs that control the shape, size, orientation, density, color, and noise level of the strokes in the individual patches. Once trained on a set of stroke patches, a U-Net based regression model can render any input image in a variety of distinct, evocative, and customizable styles.
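To make the idea of procedurally generated stroke patches concrete, the sketch below shows one plausible "small program" of the kind the abstract describes. All parameter names (`density`, `orientation`, `noise`, etc.) are illustrative assumptions, not the paper's actual generator: each stroke is rendered as a short thick line segment whose angle is jittered around a base orientation.

```python
import numpy as np

def make_stroke_patch(size=64, density=12, length=20, width=3,
                      orientation=0.0, color=(30, 60, 200),
                      noise=0.05, seed=0):
    """Render a square patch of roughly parallel brushstrokes.

    Illustrative sketch only: stroke shape, jitter model, and parameter
    names are assumptions, not the paper's actual patch generator.
    """
    rng = np.random.default_rng(seed)
    patch = np.full((size, size, 3), 255, dtype=np.uint8)  # white canvas
    for _ in range(density):
        # Random stroke center; angle jittered around the base orientation.
        cx, cy = rng.uniform(0, size, 2)
        ang = orientation + rng.normal(0.0, noise * np.pi)
        dx, dy = np.cos(ang), np.sin(ang)
        # Walk along the stroke axis, stamping a width x width square.
        for t in np.linspace(-length / 2, length / 2, length * 2):
            x, y = int(cx + t * dx), int(cy + t * dy)
            x0, x1 = max(x - width // 2, 0), min(x + width // 2 + 1, size)
            y0, y1 = max(y - width // 2, 0), min(y + width // 2 + 1, size)
            if x0 < x1 and y0 < y1:
                patch[y0:y1, x0:x1] = color

    return patch

patch = make_stroke_patch()
```

Varying these knobs (e.g. higher `density` with lower `noise` for tight hatching, or larger `width` for impasto-like marks) yields the kind of distinct patch sets that could then serve as regression targets for the U-Net.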