AI Summary
In handheld mobile devices, dynamically varying viewing distances cause perceptual target size distortion, significantly degrading dwell-based gaze interaction performance. To address this, we propose GAUI, an adaptive gaze interface that integrates monocular-camera-based real-time distance estimation, pupil-to-screen geometric modeling, and gaze-triggered UI rendering to dynamically scale target sizes according to the instantaneous viewing distance. In a two-phase user study with 24 participants, GAUI reduced task completion time and error rates compared to fixed large- and small-size baselines, while achieving the highest user preference. Our core contributions are: (1) a lightweight, context-aware dynamic adaptation framework for mobile gaze interaction; and (2) empirically grounded adaptive design guidelines tailored to gaze-based interfaces on handheld devices.
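The scaling idea can be illustrated with a minimal sketch. This is not GAUI's actual implementation; the function name and parameters are hypothetical. It assumes the standard visual-angle relation, size = 2·d·tan(θ/2): to keep a target's perceived (angular) size constant, its on-screen size must grow linearly with viewing distance.

```python
import math

def target_size_px(distance_mm: float, visual_angle_deg: float,
                   px_per_mm: float) -> float:
    """On-screen size (pixels) a target needs in order to subtend
    `visual_angle_deg` when viewed from `distance_mm`."""
    # size = 2 * d * tan(theta / 2), converted from mm to pixels
    size_mm = 2.0 * distance_mm * math.tan(math.radians(visual_angle_deg) / 2.0)
    return size_mm * px_per_mm

# Illustrative numbers: a 2-degree target on a ~160 ppi screen,
# viewed from 300 mm (close) vs 450 mm (arm's length).
ppmm = 160 / 25.4  # pixels per millimetre
near = target_size_px(300, 2.0, ppmm)
far = target_size_px(450, 2.0, ppmm)
print(round(near, 1), round(far, 1))  # the far size is 1.5x the near size
```

A fixed-size target, by contrast, shrinks in visual angle as the device moves away, which is the perceived-size distortion the summary describes.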
Abstract
Dwell input shows promise for handheld mobile contexts, but its performance depends on target size and viewing distance. While fixed target sizes suffice in static setups, in mobile settings frequent posture changes alter viewing distances, which in turn distort perceived size and hinder dwell performance. We address this with GAUI, a Gaze-based Adaptive User Interface that dynamically resizes targets to maximise performance at the current viewing distance. In a two-phase study (N=24), GAUI leveraged the strengths of its distance-responsive design, outperforming the static large-UI baseline in task time and producing fewer errors than the static small-UI baseline. It was rated the most preferred interface overall. Participants reflected on using GAUI in six different postures. We discuss how their experience is shaped by posture, and propose guidelines for designing context-aware adaptive UIs that maximise dwell performance on handheld mobile devices.