Self-Supervised Learning at the Edge: The Cost of Labeling

📅 2025-07-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the energy-efficient deployment of self-supervised learning—particularly contrastive learning—on resource-constrained edge devices. We propose a customized lightweight training strategy, systematically evaluate the accuracy–energy trade-off, and, for the first time, quantify the energy cost of data annotation. To alleviate the high energy burden of full supervision, we further integrate a semi-supervised learning paradigm. Experiments demonstrate that our approach maintains competitive downstream performance (e.g., <2% accuracy drop in image classification), reduces computational overhead to 25% of the baseline, cuts memory footprint by 35%, and achieves up to 75% reduction in total energy consumption. Moreover, semi-supervised fine-tuning reduces annotation-related energy consumption by over 60%. Our work establishes a reproducible methodology and empirical benchmark for efficient, low-carbon representation learning on edge devices.

📝 Abstract
Contrastive learning (CL) has recently emerged as an alternative to traditional supervised machine learning solutions by enabling rich representations from unstructured and unlabeled data. However, CL and, more broadly, self-supervised learning (SSL) methods often demand a large amount of data and computational resources, posing challenges for deployment on resource-constrained edge devices. In this work, we explore the feasibility and efficiency of SSL techniques for edge-based learning, focusing on trade-offs between model performance and energy efficiency. In particular, we analyze how different SSL techniques adapt to limited computational, data, and energy budgets, evaluating their effectiveness in learning robust representations under resource-constrained settings. Moreover, we also consider the energy costs involved in labeling data and assess how semi-supervised learning may assist in reducing the overall energy consumed to train CL models. Through extensive experiments, we demonstrate that tailored SSL strategies can achieve competitive performance while reducing resource consumption by up to 4X, underscoring their potential for energy-efficient learning at the edge.
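The contrastive objective the abstract refers to can be sketched with a standard formulation such as SimCLR's NT-Xent loss, where two augmented views of each sample form a positive pair and all other samples in the batch act as negatives. The snippet below is an illustrative NumPy sketch of that generic loss, not the paper's exact training strategy; the function name and temperature value are choices made here for illustration.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.

    z1, z2: embeddings of two augmented views, shape (N, D).
    Row i of z1 and row i of z2 form a positive pair; every other
    row in the concatenated batch is treated as a negative.
    """
    z = np.concatenate([z1, z2], axis=0)              # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize
    sim = z @ z.T / temperature                       # scaled cosine sims
    n = z1.shape[0]
    np.fill_diagonal(sim, -np.inf)                    # exclude self-similarity
    # The positive for row i is row (i + n) mod 2n.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

On edge devices, the batch size N directly drives the O(N²) similarity matrix, which is one reason lightweight CL training must trade negatives (and thus representation quality) against memory and energy.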
Problem

Research questions and friction points this paper addresses.

Feasibility of SSL for edge devices with limited resources
Trade-offs between model performance and energy efficiency
Reducing labeling energy costs via semi-supervised learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Contrastive learning for unlabeled data
SSL techniques for edge devices
Energy-efficient semi-supervised learning
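One common way semi-supervised learning cuts annotation cost (and hence labeling energy) is confidence-based pseudo-labeling: only a small labeled set is annotated by hand, and the model's high-confidence predictions label the rest. The sketch below shows that selection step in the style of FixMatch-like methods; the paper's exact semi-supervised scheme may differ, and the function name and threshold here are illustrative.

```python
import numpy as np

def pseudo_label(probs, threshold=0.95):
    """Select confidently predicted unlabeled samples for pseudo-labeling.

    probs: model softmax outputs on unlabeled data, shape (N, C).
    Returns the indices of samples whose max class probability meets
    `threshold`, together with their hard (argmax) pseudo-labels.
    Low-confidence samples are left unlabeled, so no human annotation
    energy is spent on them in this round.
    """
    confidence = probs.max(axis=1)
    keep = confidence >= threshold
    return np.nonzero(keep)[0], probs[keep].argmax(axis=1)
```

For example, with predictions `[[0.98, 0.02], [0.60, 0.40], [0.05, 0.95]]` only the first and third samples are pseudo-labeled (classes 0 and 1); the uncertain middle sample would be skipped or deferred to a human annotator.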