🤖 AI Summary
Existing 3D Gaussian splatting animation methods typically rely on multi-view cameras, lengthy preprocessing pipelines, or high-end GPUs; lightweight alternatives often compromise visual fidelity. This paper introduces the first real-time skinning animation framework operating directly on Gaussian splats, enabling high-fidelity, low-latency 3D avatar generation and rendering via splat-wise parallel processing and dynamic skinning-weight binding. The system supports monocular smartphone scanning, with an end-to-end workflow requiring only ~5 minutes (including just 30 seconds for avatar generation), no specialized hardware, and zero offline preprocessing. Heavily optimized for mobile devices, it achieves real-time interactive rendering on commodity smartphones and VR headsets, balancing visual quality with cross-platform lightweight performance. The approach delivers a scalable, real-time 3D character solution tailored for social media and metaverse applications.
📝 Abstract
We present Instant Skinned Gaussian Avatars, a real-time and cross-platform 3D avatar system. Many approaches have been proposed to animate Gaussian Splatting, but they often require camera arrays, long preprocessing times, or high-end GPUs. Some methods attempt to convert Gaussian Splatting into mesh-based representations, achieving lightweight performance but sacrificing visual fidelity. In contrast, our system efficiently animates Gaussian Splatting by leveraging parallel splat-wise processing to dynamically follow the underlying skinned mesh in real time while preserving high visual fidelity. From smartphone-based 3D scanning to on-device preprocessing, the entire process takes only about five minutes, with the avatar generation step itself completed in roughly 30 seconds. Our system enables users to instantly transform their real-world appearance into a 3D avatar, making it ideal for seamless integration with social media and metaverse applications. Website: https://sites.google.com/view/gaussian-vrm
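The abstract's "parallel splat-wise processing that follows the underlying skinned mesh" is not detailed here, but its likely core is standard linear blend skinning (LBS) applied independently to each splat, which parallelizes trivially. The sketch below is a minimal NumPy illustration of that idea, assuming each splat center carries bound skinning weights; the function name and array shapes are our own, not from the paper, and a real system would run this per splat on the GPU and also transform each splat's rotation and covariance.

```python
import numpy as np

def skin_splats(centers, weights, bone_transforms):
    """Deform Gaussian splat centers with linear blend skinning (LBS).

    centers:         (N, 3) splat positions in the rest pose
    weights:         (N, B) per-splat skinning weights (each row sums to 1)
    bone_transforms: (B, 4, 4) rest-to-posed bone matrices
    Returns posed (N, 3) splat positions.
    """
    n = centers.shape[0]
    # Homogeneous coordinates: (N, 4)
    homo = np.concatenate([centers, np.ones((n, 1))], axis=1)
    # Blend the bone matrices per splat: (N, 4, 4)
    blended = np.einsum("nb,bij->nij", weights, bone_transforms)
    # Apply each splat's blended matrix to its own center: (N, 4)
    posed = np.einsum("nij,nj->ni", blended, homo)
    return posed[:, :3]
```

Because every splat is processed independently, this maps directly onto a per-splat compute kernel, which is consistent with the real-time, mobile-friendly performance the paper reports.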