🤖 AI Summary
Existing robotic drawing systems struggle to model the complexity and stylistic diversity of human brushstrokes, limiting their artistic expressiveness. To address this, the authors propose a differentiable spline-based stroke model for real-world robotic painting: artist trajectories are captured via motion capture, an autoencoder learns a compact latent stroke space, and learnable spline stroke dynamics are embedded in the painting pipeline, overcoming the limited shape expressivity of the fixed Bézier-curve strokes used by prior systems. The model enables sample-efficient training, editable style transfer, and semantically grounded stroke planning. Integrated into the FRIDA robotic platform and evaluated through human perception studies, the approach outperforms prior systems in stroke artistry, human likeness, semantic planning accuracy, and stylistic diversity.
📝 Abstract
A painting is more than just a picture on a wall; it is a process composed of many intentional brush strokes, whose shapes are an important component of the painting's overall style and message. Prior work on modeling brush stroke trajectories either does not transfer to real-world robotics or is not flexible enough to capture the complexity of human-made brush strokes. In this work, we introduce Spline-FRIDA, which models complex human brush stroke trajectories. This is achieved by recording artists drawing with motion capture, modeling the extracted trajectories with an autoencoder, and introducing a novel brush stroke dynamics model to the existing robotic painting platform FRIDA. We conducted a survey and found that our open-source Spline-FRIDA approach successfully captures the stroke styles of human drawings, and that Spline-FRIDA's brush strokes are more human-like, improve semantic planning, and are more artistic compared to existing robot painting systems with restrictive Bézier-curve strokes.
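The abstract contrasts flexible spline trajectories with the restrictive Bézier-curve strokes of prior systems. As a rough illustration only (not the paper's actual stroke model), the sketch below compares a single cubic Bézier, which has just four control points, with an interpolating Catmull-Rom spline that passes exactly through arbitrarily many recorded waypoints; the function names and the choice of Catmull-Rom basis are assumptions made for this example.

```python
import numpy as np

def cubic_bezier(P, t):
    """Evaluate a cubic Bezier curve (exactly 4 control points) at parameters t."""
    P = np.asarray(P, float)
    t = np.asarray(t, float)[:, None]
    return ((1 - t) ** 3 * P[0] + 3 * (1 - t) ** 2 * t * P[1]
            + 3 * (1 - t) * t ** 2 * P[2] + t ** 3 * P[3])

def catmull_rom(pts, samples_per_seg=16):
    """Interpolating Catmull-Rom spline through arbitrarily many waypoints."""
    pts = np.asarray(pts, float)
    # Duplicate the endpoints so every segment has four neighboring points.
    ext = np.vstack([pts[0], pts, pts[-1]])
    out = []
    for i in range(len(pts) - 1):
        p0, p1, p2, p3 = ext[i], ext[i + 1], ext[i + 2], ext[i + 3]
        t = np.linspace(0, 1, samples_per_seg, endpoint=False)[:, None]
        # Standard Catmull-Rom cubic: interpolates p1 at t=0 and p2 at t=1.
        out.append(0.5 * ((2 * p1) + (-p0 + p2) * t
                          + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
                          + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3))
    out.append(pts[-1][None])  # close the curve at the final waypoint
    return np.vstack(out)
```

A cubic Bézier can only hit its first and last control points, so a long, wandering human stroke must be broken into many independent segments; an interpolating spline over captured waypoints keeps the whole trajectory in one differentiable parameterization.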