🤖 AI Summary
Existing hyperbolic feature enhancement methods operate under the closed-world assumption, rendering them inadequate for open-world settings characterized by dynamic class emergence and coexistence of known and unknown classes. To address this, we propose the first hyperbolic joint feature enhancement framework tailored for open environments. Our approach comprises three key components: (1) a collaborative enhancement mechanism for known and unknown classes within hyperbolic space; (2) a meta-learning-augmented neural ordinary differential equation (ODE) to model the evolutionary dynamics of unknown-class distributions; and (3) a hyperbolic structure-preserving regularizer coupled with an upper bound on the infinite-augmentation loss, ensuring geometric consistency and optimization stability. Evaluated across five open-world tasks—including class-incremental learning and few-shot open-set recognition—our method achieves significant performance gains. Results empirically validate the effectiveness and generalization robustness of infinite hyperbolic augmentation, transcending the limitations of conventional closed-world assumptions.
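The meta-learning-augmented neural ODE component models how a class's feature distribution evolves. The paper's actual module and training setup are not shown in this summary; as a generic illustration only, the sketch below evolves a class-mean estimate with a toy vector field integrated by fixed-step Euler (the weights `W`, the step count, and the field `tanh(W z)` are all hypothetical stand-ins for a learned network).

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 4))  # stand-in for learned ODE weights

def f(z, t):
    """Toy vector field dz/dt = tanh(W z); placeholder for a neural network."""
    return np.tanh(W @ z)

def odeint_euler(z0, t0=0.0, t1=1.0, steps=20):
    """Integrate dz/dt = f(z, t) from t0 to t1 with fixed-step Euler."""
    z, t = z0.copy(), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        z = z + h * f(z, t)
        t += h
    return z

mu0 = rng.normal(size=4)   # initial estimate of a class-distribution mean
mu1 = odeint_euler(mu0)    # estimate after evolving along the learned dynamics
```

In practice, neural ODE methods typically use adaptive solvers and backpropagate through the integration; Euler is used here only to keep the dynamics-evolution idea visible in a few lines.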
📝 Abstract
Feature augmentation generates novel samples in the feature space, providing an effective way to enhance the generalization ability of learning algorithms with hyperbolic geometry. Most hyperbolic feature augmentation methods are confined to closed environments: they assume the number of classes is fixed (*i.e.*, only seen classes exist) and generate features exclusively for those classes. In this paper, we propose a hyperbolic dual feature augmentation method for open environments, which augments features for both seen and unseen classes in the hyperbolic space. To obtain a more precise approximation of the real data distribution for efficient training, (1) we adopt a neural ordinary differential equation module, enhanced by meta-learning, to estimate the feature distributions of both seen and unseen classes; (2) we introduce a regularizer to preserve the latent hierarchical structures of data in the hyperbolic space; and (3) we derive an upper bound for the hyperbolic dual augmentation loss, allowing us to train a hyperbolic model with infinite augmentations for seen and unseen classes. Extensive experiments on five open-environment tasks (class-incremental learning, few-shot open-set recognition, few-shot learning, zero-shot learning, and general image classification) demonstrate that our method effectively enhances the performance of hyperbolic algorithms in open environments.
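The core operation behind hyperbolic feature augmentation can be illustrated with standard hyperbolic geometry: sample perturbations in the (Euclidean) tangent space at the origin and push them onto the Poincaré ball via the exponential map. This is a minimal generic sketch, not the paper's actual augmentation scheme; the mean `mu_tangent`, scale `sigma`, and unit curvature are assumptions for illustration.

```python
import numpy as np

def exp0(v, c=1.0):
    """Exponential map at the origin of the Poincaré ball with curvature -c:
    exp_0(v) = tanh(sqrt(c) * ||v||) * v / (sqrt(c) * ||v||)."""
    sqrt_c = np.sqrt(c)
    norm = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), 1e-9)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def augment(mu_tangent, sigma, n, rng):
    """Draw n Gaussian samples around a class mean in the tangent space,
    then map them into hyperbolic space."""
    v = mu_tangent + sigma * rng.standard_normal((n, mu_tangent.shape[-1]))
    return exp0(v)

rng = np.random.default_rng(0)
feats = augment(np.zeros(8), 0.5, 16, rng)
# Since tanh(x) < 1, every augmented feature lies strictly inside the unit ball.
```

Because `tanh` is bounded by 1, the mapped features always stay inside the ball, which is why tangent-space sampling plus the exponential map is a common way to define "wrapped" distributions in hyperbolic space.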