PoseGaze-AHP: A Knowledge-Based 3D Dataset for AI-Driven Ocular and Postural Diagnosis

📅 2025-10-04
🤖 AI Summary
Existing datasets model head pose and eye movements independently, hindering AI-assisted diagnosis of oculogenic abnormal head postures (AHP). To address this, we propose the first methodology for constructing a synchronized 3D head–eye motion dataset specifically designed for AI-driven oculogenic AHP analysis. Our approach jointly models head pose and gaze direction and introduces a hierarchical, large language model (LLM)-based prompting strategy that structurally extracts clinical knowledge with 91.92% accuracy. Leveraging the Neural Head Avatar (NHA) framework, we synthesize 7,920 high-fidelity 3D-rendered images covering diverse ophthalmic conditions. This dataset constitutes the first publicly available, synchronized, and geometrically precise 3D benchmark for clinically compliant and accurate AI-assisted AHP diagnosis, filling a critical gap in both ophthalmology and vision-based human-computer interaction research.

📝 Abstract
Diagnosing ocular-induced abnormal head posture (AHP) requires a comprehensive analysis of both head pose and ocular movements. However, existing datasets focus on these aspects separately, limiting the development of integrated diagnostic approaches and restricting AI-driven advancements in AHP analysis. To address this gap, we introduce PoseGaze-AHP, a novel 3D dataset that synchronously captures head pose and gaze movement information for ocular-induced AHP assessment. Structured clinical data were extracted from medical literature using large language models (LLMs) through an iterative process with the Claude 3.5 Sonnet model, combining stepwise, hierarchical, and complex prompting strategies. The extracted records were systematically imputed and transformed into 3D representations using the Neural Head Avatar (NHA) framework. The dataset includes 7,920 images generated from two head textures, covering a broad spectrum of ocular conditions. The extraction method achieved an overall accuracy of 91.92%, demonstrating its reliability for clinical dataset construction. PoseGaze-AHP is the first publicly available resource tailored for AI-driven ocular-induced AHP diagnosis, supporting the development of accurate and privacy-compliant diagnostic tools.
Problem

Research questions and friction points this paper is trying to address.

Existing datasets separate head pose and gaze data
Lack integrated diagnostic approaches for abnormal head posture
Need AI-driven tools for synchronized ocular-postural assessment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Synchronously captures head pose and gaze movement
Uses LLMs to extract clinical data from literature
Transforms records into 3D images with NHA framework
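The extraction step above can be sketched as a hierarchical prompting loop: one focused prompt per clinical field group, with the partial JSON answers merged into a single structured record. This is an illustrative sketch only; `call_llm` is a hypothetical stub standing in for the Claude 3.5 Sonnet API the paper actually uses, and the field names are assumptions, not the paper's schema.

```python
import json

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for an LLM API call (e.g. Claude 3.5 Sonnet).
    Returns canned JSON so the sketch is runnable without network access."""
    canned = {
        "condition": '{"diagnosis": "superior oblique palsy", "laterality": "left"}',
        "posture": '{"head_tilt_deg": -10, "face_turn_deg": 5, "chin_updown_deg": 0}',
        "gaze": '{"horizontal_deg": 3, "vertical_deg": -2}',
    }
    for key, value in canned.items():
        if key in prompt:
            return value
    return "{}"

def extract_record(case_text: str) -> dict:
    """Hierarchical extraction: query each field group separately with a
    narrow prompt, then merge the partial answers into one record."""
    record = {}
    for field_group in ("condition", "posture", "gaze"):
        prompt = (
            f"From the clinical case below, extract the {field_group} "
            f"fields and answer with JSON only.\n\nCase:\n{case_text}"
        )
        record[field_group] = json.loads(call_llm(prompt))
    return record

record = extract_record("A 7-year-old with left superior oblique palsy ...")
print(record["condition"]["diagnosis"])  # superior oblique palsy
```

Splitting the extraction into narrow per-group prompts (rather than one monolithic prompt) mirrors the stepwise/hierarchical strategy the abstract describes, and keeps each LLM answer small enough to validate as JSON before merging.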
Saja Al-Dabet
United Arab Emirates University
Natural Language Processing, Health Informatics, Deep Learning, Data Mining
Sherzod Turaev
College of Information Technology, United Arab Emirates University
Formal Languages and Automata, Regulated Rewriting Systems, DNA Computing, Machine Learning
Nazar Zaki
Professor, College of Information Technology, UAEU
Artificial Intelligence, Data Science, Computer Science, Graph Mining, Health Informatics
Arif O. Khan
Eye Institute, Cleveland Clinic Abu Dhabi, Abu Dhabi, United Arab Emirates; Cleveland Clinic Lerner College of Medicine of Case Western Reserve University, Cleveland, Ohio, USA
Luai Eldweik
Eye Institute, Cleveland Clinic Abu Dhabi, Abu Dhabi, United Arab Emirates; Cleveland Clinic Lerner College of Medicine of Case Western Reserve University, Cleveland, Ohio, USA