🤖 AI Summary
This study addresses the “Midas Touch” problem in gaze-based mixed reality (MR) interaction, where visual attention is easily conflated with operational intent. To resolve it, the authors propose an implicit intent-detection approach that fuses electroencephalography (EEG) with eye-tracking data. They show that the stimulus-preceding negativity (SPN) is robustly elicited during gaze-based selection and reflects anticipatory uncertainty rather than motor preparation, making it a candidate neural marker of intentional action. By synchronously recording EEG and eye-tracking signals and applying a deep learning model for person-dependent intent decoding, the method achieves classification accuracies ranging from 75% to 97% across 28 participants. These results establish the SPN as a neurophysiological marker for designing intent-aware MR interfaces that operate without explicit user confirmation.
📝 Abstract
Mixed Reality (MR) interfaces increasingly rely on gaze for interaction, yet distinguishing visual attention from intentional action remains difficult, leading to the Midas Touch problem. Existing solutions require explicit confirmations, whereas brain-computer interfaces may offer an implicit marker of intention in the form of the Stimulus-Preceding Negativity (SPN). We investigated how Intention (Select vs. Observe) and Feedback (With vs. Without) modulate the SPN during gaze-based MR interactions. We acquired EEG and eye-tracking data from 28 participants during realistic selection tasks. The SPN was robustly elicited and sensitive to both factors: observation without feedback produced the strongest amplitudes, while the intention to select and the expectation of feedback reduced activity, suggesting that the SPN reflects anticipatory uncertainty rather than motor preparation. Complementary decoding with deep learning models achieved reliable person-dependent classification of user intention, with accuracies ranging from 75% to 97% across participants. These findings identify the SPN as an implicit marker for building intention-aware MR interfaces that mitigate the Midas Touch problem.
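The paper describes its fused EEG/eye-tracking decoding pipeline only at a high level here. As a rough illustration of the person-dependent classification step, the sketch below trains a toy classifier to separate "select" from "observe" epochs using a pre-stimulus negativity feature. Everything in it is an assumption for illustration: the sampling rate, window length, synthetic SPN-like drift, and the plain logistic regression standing in for the authors' deep learning model.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 250            # sampling rate in Hz (assumed, not from the paper)
N_EPOCHS = 200      # synthetic epochs per class
WIN = FS            # 1 s pre-stimulus window

def synth_epochs(n, drift):
    """Synthetic single-channel pre-stimulus epochs with a linear drift.

    A negative drift mimics an SPN-like slow negativity building up
    before stimulus onset; real data would come from epoched EEG.
    """
    t = np.linspace(0.0, 1.0, WIN)
    noise = rng.normal(0.0, 1.0, size=(n, WIN))
    return noise + drift * t

# Per the paper's direction of effect, observation shows the stronger
# negativity and intention to select a reduced one (magnitudes invented).
X_observe = synth_epochs(N_EPOCHS, drift=-2.0)   # class 0
X_select = synth_epochs(N_EPOCHS, drift=-0.5)    # class 1

# Feature: mean amplitude over the last 500 ms before stimulus onset.
def feature(X):
    return X[:, WIN // 2:].mean(axis=1, keepdims=True)

X = np.vstack([feature(X_observe), feature(X_select)])
y = np.r_[np.zeros(N_EPOCHS), np.ones(N_EPOCHS)]

# Logistic regression via gradient descent -- a minimal stand-in for the
# deep model; one model would be fit per participant (person-dependent).
w, b = 0.0, 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X[:, 0] * w + b)))   # predicted P(select)
    g = p - y                                      # logistic-loss gradient
    w -= 0.1 * (g * X[:, 0]).mean()
    b -= 0.1 * g.mean()

pred = 1.0 / (1.0 + np.exp(-(X[:, 0] * w + b))) > 0.5
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

With the exaggerated synthetic separation above, even this one-feature classifier separates the classes well; the study's reported 75%-97% range reflects real inter-participant variability that this toy setup does not capture.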