Flatness Improves Backbone Generalisation in Few-shot Classification

📅 2024-04-11
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Multi-domain few-shot classification (FSC) suffers from insufficient backbone generalisation, which limits both in-domain and cross-domain performance. Method: the paper identifies training flatness, a property of the loss landscape, as a critical factor governing backbone generalisation in FSC. Departing from prevailing adapter-centric approaches, it proposes a simple yet effective baseline: applying flatness regularisation (e.g., Sharpness-Aware Minimization, SAM) while fine-tuning pretrained backbones, without altering downstream adapter architectures. Contribution/Results: experiments show consistent improvements across diverse adapters, including ProtoNet and Meta-Baseline, on Mini-ImageNet, Tiered-ImageNet, and cross-domain CUB benchmarks, yielding a baseline competitive with the state of the art. The work provides systematic empirical support for the principle that flatter backbones generalise better, establishing flatness-aware backbone optimisation as a promising direction for few-shot learning.
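The flatness regulariser named in the summary, Sharpness-Aware Minimization, works in two steps per update: ascend to a worst-case weight perturbation within a small L2 ball, then apply the gradient computed at that perturbed point to the original weights. The sketch below illustrates the idea on a toy least-squares problem with plain NumPy; it is not the paper's implementation, and the loss, step sizes, and data are stand-ins chosen for illustration only.

```python
import numpy as np

def loss(w, X, y):
    # Toy least-squares loss standing in for the fine-tuning objective.
    r = X @ w - y
    return 0.5 * np.mean(r ** 2)

def grad(w, X, y):
    # Gradient of the least-squares loss above.
    r = X @ w - y
    return X.T @ r / len(y)

def sam_step(w, X, y, lr=0.1, rho=0.05):
    """One Sharpness-Aware Minimization step:
    1) move to the (first-order) worst-case point within an L2 ball of radius rho,
    2) evaluate the gradient there and apply it to the ORIGINAL weights."""
    g = grad(w, X, y)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)  # ascent direction, norm rho
    g_sharp = grad(w + eps, X, y)                # gradient at the perturbed point
    return w - lr * g_sharp

# Synthetic regression data: recover w_true from noiseless observations.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 5))
w_true = rng.normal(size=5)
y = X @ w_true

w = np.zeros(5)
for _ in range(200):
    w = sam_step(w, X, y)
```

Because the gradient is taken at the sharpest nearby point rather than at the current weights, minima in sharp regions are penalised and the iterate settles in flatter basins, which is the property the paper links to backbone generalisation.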

📝 Abstract
Deployment of deep neural networks in real-world settings typically requires adaptation to new tasks with few examples. Few-shot classification (FSC) provides a solution to this problem by leveraging pre-trained backbones for fast adaptation to new classes. Surprisingly, most efforts have only focused on developing architectures for easing the adaptation to the target domain without considering the importance of backbone training for good generalisation. We show that flatness-aware backbone training with vanilla fine-tuning results in a simpler yet competitive baseline compared to the state-of-the-art. Our results indicate that for in- and cross-domain FSC, backbone training is crucial to achieving good generalisation across different adaptation methods. We advocate more care should be taken when training these models.
Problem

Research questions and friction points this paper is trying to address.

Backbone networks generalise insufficiently in few-shot classification
Multi-domain FSC pipelines are complex and adapter-centric
Backbone training is neglected relative to adapter design
Innovation

Methods, ideas, or system contributions that make the work stand out.

Flatness-aware training enhances backbone generalization.
Simplified backbone selection for multi-domain FSC.
Effective fine-tuning improves few-shot classification performance.