🤖 AI Summary
To address the challenge of training vision models in data-deficient settings where transfer learning is disallowed, this work organizes the "VIPriors: Visual Inductive Priors for Data-Efficient Deep Learning" workshop series, comprising four editions of data-impaired challenges. Participants must train models from scratch on a small number of samples, without any form of transfer learning, which is intended to stimulate methods that incorporate prior knowledge to improve data efficiency. Successful entries typically combine end-to-end training from scratch, large ensembles mixing Transformers and CNNs, and heavy data augmentation; in some entries, novel prior-knowledge-based methods also contribute to performance. Key contributions include: (i) a reproducible, from-scratch benchmark setting for data-efficient computer vision; (ii) empirical evidence on which training strategies work under severely limited data; and (iii) indications that explicitly modeling visual inductive priors can help alleviate data hunger.
📝 Abstract
Deep Learning requires large amounts of data to train models that work well. In data-deficient settings, performance degrades. We investigate which Deep Learning methods benefit the training of models in a data-deficient setting by organizing the "VIPriors: Visual Inductive Priors for Data-Efficient Deep Learning" workshop series, featuring four editions of data-impaired challenges. These challenges address the problem of training deep learning models for computer vision tasks with limited data. Participants are limited to training models from scratch using a low number of training samples and are not allowed to use any form of transfer learning. We aim to stimulate the development of novel approaches that incorporate prior knowledge to improve the data efficiency of deep learning models. Successful challenge entries make use of large model ensembles that mix Transformers and CNNs, as well as heavy data augmentation. Novel prior-knowledge-based methods contribute to success in some entries.
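The abstract notes that successful entries ensemble Transformers and CNNs. A common way to combine such heterogeneous models is to average their class probabilities and take the argmax. The sketch below illustrates this with NumPy on toy logits; the function names and the two-model setup are illustrative assumptions, not the specific ensembling scheme of any challenge entry.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over class logits.
    shifted = logits - logits.max(axis=axis, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=axis, keepdims=True)

def ensemble_predict(per_model_logits):
    # Average class probabilities across models (e.g. a CNN and a
    # Transformer), then pick the most likely class per sample.
    # per_model_logits: list of (batch, num_classes) arrays.
    probs = np.mean([softmax(l) for l in per_model_logits], axis=0)
    return probs.argmax(axis=-1)

# Toy example: the two "models" disagree on the second sample;
# probability averaging resolves the disagreement.
cnn_logits = np.array([[2.0, 0.1, 0.1],
                       [0.2, 1.5, 0.3]])
vit_logits = np.array([[1.8, 0.2, 0.0],
                       [0.1, 0.4, 2.5]])
preds = ensemble_predict([cnn_logits, vit_logits])  # -> array([0, 2])
```

Averaging probabilities rather than raw logits keeps models with different output scales from dominating the ensemble, which matters when mixing architecture families.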