Data-Local Autonomous LLM-Guided Neural Architecture Search for Multiclass Multimodal Time-Series Classification

📅 2026-03-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
For privacy-sensitive multimodal time-series classification, this paper proposes a data-local, LLM-guided neural architecture search method that automatically optimizes model architectures and preprocessing pipelines without accessing the raw data.

📝 Abstract
Applying machine learning to sensitive time-series data is often bottlenecked by the iteration loop: performance depends strongly on preprocessing and architecture, yet training often has to run on-premise under strict data-local constraints. This is a common problem in healthcare and other privacy-constrained domains (e.g., a hospital developing deep learning models on patient EEG). The bottleneck is particularly challenging in multimodal fusion, where sensor modalities must be individually preprocessed and then combined. LLM-guided neural architecture search (NAS) can automate this exploration, but most existing workflows assume cloud execution or access to data-derived artifacts that cannot be exposed. We present a novel data-local, LLM-guided search framework that proposes candidate pipelines remotely while executing all training and evaluation locally under a fixed protocol. The controller observes only trial-level summaries, such as pipeline descriptors, metrics, learning-curve statistics, and failure logs, without ever accessing raw samples or intermediate feature representations. Our framework targets multiclass, multimodal learning via one-vs-rest binary experts per class and modality, a lightweight fusion MLP, and joint search over expert architectures and modality-specific preprocessing. We evaluate our method in two regimes: UEA30 (a public multivariate time-series classification benchmark) and SleepEDFx sleep staging (heterogeneous clinical modalities such as EEG, EOG, and EMG). The results show that the modular baseline is strong and that LLM-guided NAS further improves it. Notably, our method finds models that perform within published ranges on most benchmark datasets. Across both settings, our method reduces manual intervention by enabling unattended architecture search while keeping sensitive data on-premise.
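The data-local protocol in the abstract can be illustrated with a minimal sketch: the remote controller never sees raw samples or feature tensors, only an aggregated trial-level summary produced locally after each training run. All field and function names below are hypothetical, chosen for illustration; the paper's actual summary schema may differ.

```python
# Minimal sketch of a trial-level summary exchanged with a remote
# LLM controller under a data-local protocol. Only aggregates leave
# the premises: a pipeline descriptor, scalar metrics, learning-curve
# statistics, and failure logs -- never raw samples or features.

def make_trial_summary(pipeline_descriptor, val_accuracy, losses, failure_log=None):
    """Aggregate one locally executed trial into a data-free summary dict."""
    return {
        # Architecture + modality-specific preprocessing choices (symbolic only).
        "pipeline": pipeline_descriptor,
        "metrics": {"val_accuracy": val_accuracy},
        # Curve statistics, not the per-sample losses themselves.
        "learning_curve": {
            "final_loss": losses[-1],
            "min_loss": min(losses),
            "epochs": len(losses),
        },
        "failures": failure_log or [],
    }

summary = make_trial_summary(
    {"expert": "cnn_3x64", "preproc": {"EEG": "bandpass_0.5_30"}},
    val_accuracy=0.81,
    losses=[1.32, 0.74, 0.55],
)
```

In this sketch the summary dict is what would be serialized and sent to the controller, which replies with the next candidate pipeline descriptor; the training loop itself stays entirely on-premise.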
Problem

Research questions and friction points this paper is trying to address.

data-local constraints
multimodal time-series classification
neural architecture search
privacy-preserving machine learning
on-premise training
Innovation

Methods, ideas, or system contributions that make the work stand out.

data-local NAS
LLM-guided architecture search
multimodal time-series classification
privacy-preserving machine learning
automated preprocessing
Emil Hardarson
Reykjavik University, Reykjavik, Iceland
Luka Biedebach
Reykjavik University, Reykjavik, Iceland
Ómar Bessi Ómarsson
Reykjavik University, Reykjavik, Iceland
Teitur Hrólfsson
Reykjavik University, Reykjavik, Iceland
Anna Sigridur Islind
Professor, Reykjavik University
Information Systems, Co-Design, Digital Platforms, Data-driven Healthcare, Digital Health
María Óskarsdóttir
Associate Professor, University of Southampton and Reykjavík University
Data Science, Social Network Analytics, Business Analytics, Machine Learning, Network Science