ContextGPT: Infusing LLMs Knowledge into Neuro-Symbolic Activity Recognition Models

📅 2024-03-11
🏛️ International Conference on Smart Computing
📈 Citations: 5
Influential: 1
🤖 AI Summary
To address the scarcity of labeled data and the high engineering cost of manually constructed logic-based knowledge bases in context-aware human activity recognition (HAR), this paper proposes ContextGPT, a lightweight, scalable framework that infuses knowledge from pre-trained Large Language Models (LLMs) into Neuro-Symbolic (NeSy) HAR models. Instead of explicit ontology modeling, the approach introduces a prompt-engineering mechanism for common-sense knowledge extraction: it automatically retrieves structured, activity- and context-relevant priors from the implicit knowledge encoded in LLMs and injects them into supervised deep learning classifiers. Experiments show that, in data-scarce settings, the method matches or surpasses traditional logic-based NeSy approaches in recognition accuracy, while substantially reducing reliance on domain experts and manual knowledge engineering.

📝 Abstract
Context-aware Human Activity Recognition (HAR) is a hot research area in mobile computing, and the most effective solutions in the literature are based on supervised deep learning models. However, the actual deployment of these systems is limited by the scarcity of labeled data that is required for training. Neuro-Symbolic AI (NeSy) provides an interesting research direction to mitigate this issue, by infusing common-sense knowledge about human activities and the contexts in which they can be performed into HAR deep learning classifiers. Existing NeSy methods for context-aware HAR rely on knowledge encoded in logic-based models (e.g., ontologies) whose design, implementation, and maintenance to capture new activities and contexts require significant human engineering efforts, technical knowledge, and domain expertise. Recent works show that pre-trained Large Language Models (LLMs) effectively encode common-sense knowledge about human activities. In this work, we propose ContextGPT: a novel prompt engineering approach to retrieve from LLMs common-sense knowledge about the relationship between human activities and the context in which they are performed. Unlike ontologies, ContextGPT requires limited human effort and expertise, while sharing similar privacy concerns if the reasoning is performed in the cloud. An extensive evaluation using two public datasets shows how a NeSy model obtained by infusing common-sense knowledge from ContextGPT is effective in data scarcity scenarios, leading to similar (and sometimes better) recognition rates than logic-based approaches with a fraction of the effort.
Problem

Research questions and friction points this paper is trying to address.

Addresses scarcity of labeled data in Human Activity Recognition (HAR).
Reduces human effort in encoding common-sense knowledge for HAR.
Improves recognition rates in data scarcity scenarios using LLMs.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Infuses LLM knowledge into neuro-symbolic HAR models.
Uses prompt engineering to retrieve activity-context relationships.
Reduces human effort compared to logic-based approaches.
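The bullets above describe the core mechanism: prompt an LLM for activities that are plausible in a given context, turn its answer into a prior over activity classes, and combine that prior with a supervised classifier's predictions. The paper's own implementation is not shown here; the following is a minimal sketch of that idea under stated assumptions — the prompt template, the `query_llm` stub (a placeholder returning a canned answer instead of a real LLM call), and all function names are hypothetical.

```python
import numpy as np

ACTIVITIES = ["walking", "cooking", "sleeping", "cycling"]

def build_prompt(context: str) -> str:
    # Hypothetical prompt template asking the LLM which activities
    # are plausible in the described context.
    return (
        f"A user is {context}. "
        f"Which of these activities are plausible: {', '.join(ACTIVITIES)}? "
        "Answer with a comma-separated list."
    )

def query_llm(prompt: str) -> str:
    # Stub standing in for a real LLM API call; returns a canned answer
    # so the sketch runs offline.
    return "cooking, walking"

def knowledge_prior(context: str, eps: float = 0.05) -> np.ndarray:
    # Parse the LLM answer into a normalized prior over activity classes;
    # eps keeps implausible classes from being zeroed out entirely.
    answer = query_llm(build_prompt(context))
    plausible = {a.strip().lower() for a in answer.split(",")}
    prior = np.array([1.0 if a in plausible else eps for a in ACTIVITIES])
    return prior / prior.sum()

def infuse(logits: np.ndarray, prior: np.ndarray, weight: float = 1.0) -> np.ndarray:
    # Combine the classifier's softmax output with the LLM-derived prior
    # and renormalize; weight controls how strongly the prior is applied.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    combined = probs * prior**weight
    return combined / combined.sum()
```

For example, if the raw classifier slightly favors "sleeping" but the LLM deems only "cooking" and "walking" plausible for the context, the infused distribution shifts toward those classes — which is the mechanism the bullets describe, not a claim about the paper's exact formulation.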
Luca Arrotta
EveryWare Lab, Dept. of Computer Science, University of Milan

Claudio Bettini
Università degli Studi di Milano
Pervasive Computing · Data Privacy · Temporal Data · Artificial Intelligence · Digital Health

Gabriele Civitarese
EveryWare Lab, Dept. of Computer Science, University of Milan

Michele Fiori
University of Milan
Human Activity Recognition · Explainable AI · Foundation Models