Model-Guided Microstimulation Steers Primate Visual Behavior

📅 2025-10-04
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Current visual prostheses are constrained by hardware resolution limitations and the low-level representational capacity of early visual cortex, hindering elicitation of complex, object-level perception. To address this, we propose a computation model–driven causal intervention paradigm. We introduce the first unified computational framework integrating a perturbation module, a topologically informed functional model of high-level visual cortex, and a cross-species neural mapping mechanism—enabling precise modeling and microelectrical stimulation–guided modulation of neural activity in primate higher visual areas (e.g., inferotemporal cortex). In macaque visual recognition tasks, our approach significantly modulates perceptual choice behavior, with strong alignment between model predictions and behavioral responses (r > 0.8), and successfully evokes object-level neural representations resembling “face pareidolia.” These results establish a novel pathway toward overcoming fundamental bottlenecks in existing visual prostheses and advancing next-generation neurointerfaces capable of decoding semantically meaningful visual content.

Technology Category

Application Category

📝 Abstract
Brain stimulation is a powerful tool for understanding cortical function and holds promise for therapeutic interventions in neuropsychiatric disorders. Initial visual prosthetics apply electric microstimulation to early visual cortex which can evoke percepts of simple symbols such as letters. However, these approaches are fundamentally limited by hardware constraints and the low-level representational properties of this cortical region. In contrast, higher-level visual areas encode more complex object representations and therefore constitute a promising target for stimulation - but determining representational targets that reliably evoke object-level percepts constitutes a major challenge. We here introduce a computational framework to causally model and guide stimulation of high-level cortex, comprising three key components: (1) a perturbation module that translates microstimulation parameters into spatial changes to neural activity, (2) topographic models that capture the spatial organization of cortical neurons and thus enable prototyping of stimulation experiments, and (3) a mapping procedure that links model-optimized stimulation sites back to primate cortex. Applying this framework in two macaque monkeys performing a visual recognition task, model-predicted stimulation experiments produced significant in-vivo changes in perceptual choices. Per-site model predictions and monkey behavior were strongly correlated, underscoring the promise of model-guided stimulation. Image generation further revealed a qualitative similarity between in-silico stimulation of face-selective sites and a patient's report of facephenes. This proof-of-principle establishes a foundation for model-guided microstimulation and points toward next-generation visual prosthetics capable of inducing more complex visual experiences.
Problem

Research questions and friction points this paper is trying to address.

Developing computational models to guide microstimulation in high-level visual cortex
Overcoming hardware limitations of current visual prosthetic approaches
Enabling complex visual percepts through targeted cortical stimulation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Model-guided microstimulation steers primate visual behavior
Computational framework predicts stimulation effects on cortex
Mapping procedure links model sites to primate cortex
🔎 Similar Papers
No similar papers found.
Johannes Mehrer
Johannes Mehrer
EPFL
B
Ben Lonnqvist
EPFL
A
Anna Mitola
Netherlands Institute for Neuroscience, University of Parma
A
Abdulkadir Gokce
EPFL
P
Paolo Papale
Netherlands Institute for Neuroscience
Martin Schrimpf
Martin Schrimpf
EPFL
NeuroAIComputational NeuroscienceDeep LearningVisionLanguage