🤖 AI Summary
Visually impaired individuals rely on sighted guides for outdoor running, yet existing solutions focus on navigation or obstacle avoidance and neglect real-time interpersonal gait-rhythm coordination, leaving runners dependent on verbal commands or physical tethers. This paper proposes a human–human interaction framework centered on interpersonal rhythmic entrainment, implemented as a lightweight, non-visual haptic interface on smartwatches. The system delivers vibrotactile pulses to synchronize step frequency between runner and guide, supporting both preset rhythms and a dynamic gait-following mode, and leverages haptic entrainment to achieve bidirectional gait alignment. It combines adaptive step-frequency detection with low-latency synchronization, aiming to reduce cognitive and communicative load while making co-running feel more natural and comfortable. The poster presents the system architecture and outlines a path toward field deployment and multimodal extensions, advancing socially integrated and more autonomous running experiences for visually impaired users.
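The "adaptive step-frequency detection" mentioned above can be pictured as simple peak-picking on the smartwatch accelerometer signal. The sketch below is a minimal illustration under assumed names and thresholds, not the paper's actual algorithm:

```python
def detect_cadence(accel_mag, sample_rate_hz, threshold=1.2, min_gap_s=0.25):
    """Estimate cadence in steps per minute from an accelerometer
    magnitude trace, using threshold crossings with a refractory gap
    so a single footfall is not double-counted.

    All parameter values here are illustrative assumptions."""
    min_gap = int(min_gap_s * sample_rate_hz)  # samples to skip after a step
    last_peak = -min_gap
    steps = 0
    for i, a in enumerate(accel_mag):
        if a > threshold and (i - last_peak) >= min_gap:
            steps += 1
            last_peak = i
    duration_min = len(accel_mag) / sample_rate_hz / 60.0
    return steps / duration_min if duration_min > 0 else 0.0
```

A real deployment would adapt the threshold and refractory window to the runner's gait rather than fixing them, which is presumably where the "adaptive" part of the described detector comes in.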
📝 Abstract
Visually impaired individuals often require a guide runner to safely participate in outdoor running. However, maintaining synchronized pacing with verbal cues or tethers can be mentally taxing and physically restrictive. Existing solutions primarily focus on navigation or obstacle avoidance but overlook the importance of real-time interpersonal rhythm coordination during running. We introduce RunPacer, a smartwatch-based vibrotactile feedback system that delivers synchronized rhythmic pulses to both runners. In contrast to conventional guide-running systems that rely heavily on continuous verbal communication or mechanical tethering, RunPacer emphasizes interpersonal cadence alignment as its core interaction model. By pre-setting a target step frequency or dynamically adapting to the guide's natural pace, the system ensures that both runners receive identical haptic cues, enabling them to maintain coordinated motion intuitively and efficiently. This poster presents the system architecture, positions it within prior research on haptic entrainment, and outlines the vision for future field deployment, including potential multimodal feedback extensions. RunPacer contributes a lightweight, socially cooperative, and non-visual assistive framework that reimagines co-running as a shared, embodied, and accessible experience.
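The two pacing modes described in the abstract, a preset target step frequency and dynamic adaptation to the guide's natural pace, can be sketched as a shared pulse scheduler. The class below is a hypothetical illustration (the smoothing scheme and all parameter values are assumptions, not the system's actual logic):

```python
class PulseScheduler:
    """Illustrative sketch of the two RunPacer pacing modes: a fixed
    preset cadence, or a "follow" mode that tracks the guide's pace.
    Both watches would run the same schedule, so runner and guide
    feel identical haptic cues."""

    def __init__(self, mode="preset", preset_spm=170.0, alpha=0.2):
        self.mode = mode        # "preset" or "follow"
        self.spm = preset_spm   # current cadence, steps per minute
        self.alpha = alpha      # smoothing factor (assumed value)

    def on_guide_step_interval(self, interval_s):
        """In follow mode, blend the guide's latest inter-step interval
        into the cadence estimate via an exponential moving average."""
        if self.mode == "follow" and interval_s > 0:
            observed_spm = 60.0 / interval_s
            self.spm = (1 - self.alpha) * self.spm + self.alpha * observed_spm

    def next_pulse_delay(self):
        """Seconds until the next vibrotactile pulse."""
        return 60.0 / self.spm
```

For example, a preset cadence of 180 spm yields one pulse every 60/180 ≈ 0.33 s; in follow mode the pulse interval drifts smoothly toward the guide's measured stride timing instead of jumping on every step.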