CogFormer: Learn All Your Models Once

📅 2026-03-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional simulation-based inference methods in cognitive modeling struggle to adapt efficiently to changes in model structure, parameters, priors, or experimental design: each adjustment typically requires retraining, thereby forfeiting the benefits of amortization. To address this limitation, this work proposes CogFormer, a meta-amortized framework built on the Transformer architecture that, after a single training phase, enables general-purpose parameter inference across a combinatorially large family of structurally similar cognitive models. CogFormer supports flexible inference across diverse data types, parameter configurations, design matrices, and sample sizes. It delivers accurate parameter estimates for binary, multi-alternative, and continuous-response decision models while incurring minimal amortization overhead, substantially accelerating the cognitive modeling workflow.
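
To make "meta-amortization" concrete, the following is a sketch in standard amortized-SBI notation; the objective and notation are assumptions about a typical neural-posterior setup, not taken from the paper. A conventional amortized estimator q_φ(θ | x) is trained for one fixed model, whereas a meta-amortized estimator additionally conditions on a model specification m (parameterization, prior, design, sample size), so a single network remains valid across the whole family:

```latex
% Hedged sketch: standard amortized SBI objective for a single fixed model
\phi^{*} = \arg\min_{\phi} \;
  \mathbb{E}_{p(\theta)\, p(x \mid \theta)}
  \left[ -\log q_{\phi}(\theta \mid x) \right]

% Assumed meta-amortized variant: also sample a model specification m,
% so one set of weights covers the family of structurally similar models
\phi^{*} = \arg\min_{\phi} \;
  \mathbb{E}_{p(m)\, p_{m}(\theta)\, p_{m}(x \mid \theta)}
  \left[ -\log q_{\phi}(\theta \mid x, m) \right]
```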

📝 Abstract
Simulation-based inference (SBI) with neural networks has accelerated and transformed cognitive modeling workflows. SBI enables modelers to fit complex models that were previously difficult or impossible to estimate, while also allowing rapid estimation across large numbers of datasets. However, the utility of SBI for iterating over varying modeling assumptions remains limited: changing parameterizations, generative functions, priors, and design variables all necessitate model retraining and hence diminish the benefits of amortization. To address these issues, we pilot a meta-amortized framework for cognitive modeling which we nickname the CogFormer. Our framework trains a transformer-based architecture that remains valid across a combinatorial number of structurally similar models, allowing for changing data types, parameters, design matrices, and sample sizes. We present promising quantitative results across families of decision-making models for binary, multi-alternative, and continuous responses. Our evaluation suggests that CogFormer can accurately estimate parameters across model families with a minimal amortization offset, making it a potentially powerful engine that catalyzes cognitive modeling workflows.
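
The abstract's central claim, a single transformer that remains valid across changing data types, parameters, design matrices, and sample sizes, can be pictured with a toy sketch. The code below is a minimal, hypothetical illustration in PyTorch, not the authors' implementation: all names (`MetaAmortizedEstimator`, `simulate_batch`), the Gaussian toy model family, and the MSE point-estimate objective (a stand-in for full posterior inference) are assumptions for illustration only.

```python
# Minimal meta-amortized SBI sketch (illustrative assumption, not CogFormer itself):
# a transformer encoder ingests a variable-length set of trials plus a
# model-specification token, and is trained across randomly sampled model variants.
import torch
import torch.nn as nn

class MetaAmortizedEstimator(nn.Module):
    """Encodes a set of trials conditioned on a model-specification vector."""
    def __init__(self, obs_dim=2, spec_dim=3, d_model=64, n_params=2):
        super().__init__()
        self.embed_obs = nn.Linear(obs_dim, d_model)
        self.embed_spec = nn.Linear(spec_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_params)

    def forward(self, x, spec):
        # x: (batch, n_trials, obs_dim); spec: (batch, spec_dim)
        tokens = self.embed_obs(x)
        spec_tok = self.embed_spec(spec).unsqueeze(1)   # prepended "spec" token
        h = self.encoder(torch.cat([spec_tok, tokens], dim=1))
        return self.head(h[:, 0])                       # read out from the spec token

def simulate_batch(batch=32, max_trials=50):
    """Toy generator: each dataset comes from a randomly chosen model variant
    (here just a prior-scale switch) with a random number of trials."""
    n = torch.randint(10, max_trials + 1, (1,)).item()
    variant = torch.randint(0, 2, (batch, 1)).float()   # which model in the family
    scale = 1.0 + variant                               # variant-specific prior scale
    theta = torch.randn(batch, 2) * scale               # parameters drawn from the prior
    x = theta.unsqueeze(1) + torch.randn(batch, n, 2)   # i.i.d. Gaussian "trials"
    spec = torch.cat([variant, scale, torch.full((batch, 1), float(n))], dim=1)
    return x, spec, theta

model = MetaAmortizedEstimator()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):                                 # one training phase, all variants
    x, spec, theta = simulate_batch()
    loss = nn.functional.mse_loss(model(x, spec), theta)
    opt.zero_grad(); loss.backward(); opt.step()
```

After training, the same weights estimate parameters for any variant and sample size seen during simulation, which is the amortization payoff the abstract describes; a full neural-posterior objective would replace the MSE head here.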
Problem

Research questions and friction points this paper is trying to address.

simulation-based inference
cognitive modeling
amortization
model retraining
transformer
Innovation

Methods, ideas, or system contributions that make the work stand out.

meta-amortization
transformer-based architecture
simulation-based inference
cognitive modeling
model generalization
Jerry M. Huang
Center for Modeling, Simulation, and Imaging in Medicine (CeMSIM), Rensselaer Polytechnic Institute, Troy, New York, United States
Lukas Schumacher
University of Basel
Decision-making, Cognitive modeling, Bayesian statistics
Niek Stevenson
Amsterdam Mathematical Psychology Lab, University of Amsterdam, Amsterdam, Netherlands
Stefan T. Radev
Assistant Professor, Rensselaer Polytechnic Institute
Deep Learning, Bayesian Statistics, Stochastic Models, Machine Learning, Cognitive Modeling