🤖 AI Summary
This paper addresses the core challenge of correspondence inconsistency in unsupervised many-to-many point-wise shape matching. To this end, we propose DcMatch, which uses a shape graph attention network to model the manifold structure of a shape collection within a shared "universe" embedding space. Methodologically: (1) we introduce a two-level cycle-consistency mechanism that jointly optimizes spatial- and spectral-domain mappings to ensure cross-shape correspondence consistency; (2) we design a universe predictor that robustly projects individual shapes into the shared universe space; and (3) we integrate spectral graph convolution with self-supervised learning to jointly optimize multi-shape correspondences in a unified latent space. Evaluated on FAUST, SURREAL, and other benchmarks, DcMatch significantly outperforms state-of-the-art methods in matching accuracy while demonstrating superior robustness to topological and geometric perturbations.
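To illustrate the shape-to-universe idea behind the cycle-consistency mechanism, the minimal NumPy sketch below composes soft shape-to-universe assignment matrices into pairwise correspondences and penalizes round trips that deviate from the identity. This is a simplified, hypothetical rendering of the general technique, not the paper's actual loss: the function names, the Frobenius-norm penalty, and the matrix shapes are all assumptions, and the paper's second (spectral/functional-map) level is omitted.

```python
import numpy as np

def pairwise_map(P_i, P_j):
    """Compose two soft shape-to-universe assignments into a shape_i -> shape_j map.

    P_i: (n_i, k) rows are soft assignments of shape i's points to k universe points.
    P_j: (n_j, k) same for shape j. Returns an (n_i, n_j) soft correspondence.
    """
    return P_i @ P_j.T

def cycle_consistency_loss(P_i, P_j):
    """Penalize the round trip shape_i -> shape_j -> shape_i deviating from identity."""
    P_ij = pairwise_map(P_i, P_j)          # (n_i, n_j)
    P_ji = pairwise_map(P_j, P_i)          # (n_j, n_i)
    cycle = P_ij @ P_ji                    # (n_i, n_i), ideally the identity
    identity = np.eye(P_i.shape[0])
    return np.linalg.norm(cycle - identity, ord="fro") ** 2 / P_i.shape[0]
```

When two shapes map to the universe consistently (e.g., both assignments are permutations of the same universe points), the loss is zero; inconsistent assignments yield a positive penalty, which is the signal the training objective exploits.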
📄 Abstract
Establishing point-to-point correspondences across multiple 3D shapes is a fundamental problem in computer vision and graphics. In this paper, we introduce DcMatch, a novel unsupervised learning framework for non-rigid multi-shape matching. Unlike existing methods that learn a canonical embedding from a single shape, our approach leverages a shape graph attention network to capture the underlying manifold structure of the entire shape collection. This enables the construction of a more expressive and robust shared latent space, leading to more consistent shape-to-universe correspondences via a universe predictor. Simultaneously, we represent these correspondences in both the spatial and spectral domains and enforce their alignment in the shared universe space through a novel cycle consistency loss. This dual-level consistency fosters more accurate and coherent mappings. Extensive experiments on several challenging benchmarks demonstrate that our method consistently outperforms previous state-of-the-art approaches across diverse multi-shape matching scenarios. Code is available at https://github.com/YeTianwei/DcMatch.