PREBA: Surgical Duration Prediction via PCA-Weighted Retrieval-Augmented LLMs and Bayesian Averaging Aggregation

📅 2026-02-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing approaches to surgical duration prediction either fail to model institution-specific clinical context in zero-shot settings—leading to unstable and clinically implausible large language model (LLM) outputs—or rely on supervised learning with extensive labeled data and high computational costs. To address this, we propose PREBA, a novel framework that integrates PCA-weighted retrieval and Bayesian averaging into the retrieval-augmented generation (RAG) paradigm. PREBA retrieves clinically similar historical cases using PCA-weighted similarity, constructs evidence-enhanced prompts, and fuses multi-round LLM predictions with population-level statistical priors via Bayesian averaging to produce calibrated duration estimates without any model training. Evaluated on two real-world clinical datasets, PREBA reduces mean absolute error by up to 40% compared to zero-shot inference and improves R² from −0.13 to 0.62, achieving accuracy, stability, and clinical plausibility on par with supervised models.
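The summary's final fusion step, combining multi-round LLM predictions with population-level statistical priors, can be sketched as a Gaussian conjugate update. This is a hedged reading, not the paper's verified implementation: the function name `bayesian_average`, the precision-weighted form, and the variance floor are all illustrative assumptions.

```python
import numpy as np

def bayesian_average(llm_preds, prior_mean, prior_var):
    """Fuse multi-round LLM duration predictions (minutes) with a
    population-level prior via precision-weighted Gaussian updating.
    Sketch only; PREBA's exact aggregation rule may differ."""
    preds = np.asarray(llm_preds, dtype=float)
    n = len(preds)
    sample_mean = preds.mean()
    # fall back to the prior variance when only one round is available
    sample_var = preds.var(ddof=1) if n > 1 else prior_var
    sample_var = max(sample_var, 1e-6)  # guard against degenerate (zero) spread
    # posterior precision = prior precision + evidence precision
    post_prec = 1.0 / prior_var + n / sample_var
    post_mean = (prior_mean / prior_var + n * sample_mean / sample_var) / post_prec
    return post_mean
```

Under this reading, a wide prior lets the LLM rounds dominate, while a tight prior pulls outlying LLM outputs back toward the institutional average, which matches the paper's stated goal of calibrated, clinically plausible estimates.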

📝 Abstract
Accurate prediction of surgical duration is pivotal for hospital resource management. Although recent supervised learning approaches, from machine learning (ML) to fine-tuned large language models (LLMs), have shown strong performance, they remain constrained by the need for high-quality labeled data and computationally intensive training. In contrast, zero-shot LLM inference offers a promising training-free alternative, but it lacks grounding in institution-specific clinical context (e.g., local demographics and case-mix distributions), making its predictions clinically misaligned and prone to instability. To address these limitations, we present PREBA, a retrieval-augmented framework that integrates PCA-weighted retrieval and Bayesian averaging aggregation to ground LLM predictions in institution-specific clinical evidence and statistical priors. The core of PREBA is to construct an evidence-based prompt for the LLM, comprising (1) the most clinically similar historical surgical cases and (2) clinical statistical priors. To achieve this, PREBA first encodes heterogeneous clinical features into a unified representation space, enabling systematic retrieval. It then performs PCA-weighted retrieval to identify clinically relevant historical cases, which form the evidence context supplied to the LLM. Finally, PREBA applies Bayesian averaging to fuse multi-round LLM predictions with population-level statistical priors, yielding calibrated and clinically plausible duration estimates. We evaluate PREBA on two real-world clinical datasets using three state-of-the-art LLMs: Qwen3, DeepSeek-R1, and HuatuoGPT-o1. PREBA significantly improves performance, for instance reducing MAE by up to 40% and raising R² from −0.13 to 0.62 over zero-shot inference, and it achieves accuracy competitive with supervised ML methods, demonstrating strong effectiveness and generalization.
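The retrieval step described in the abstract can be sketched with plain NumPy: project historical cases into a PCA space and weight each component's contribution to the distance by its explained variance. This is one plausible interpretation of "PCA-weighted retrieval"; the function name, the component count, and the explained-variance weighting are illustrative assumptions, not the paper's confirmed design.

```python
import numpy as np

def pca_weighted_retrieval(X, query, k=5, n_components=3):
    """Return indices of the k historical cases most similar to `query`,
    using a PCA projection with explained-variance-weighted distances.
    A sketch of one plausible reading of PREBA's retrieval step."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # PCA via SVD of the centered feature matrix
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = S**2 / (len(X) - 1)
    w = var[:n_components] / var[:n_components].sum()  # explained-variance weights
    # project historical cases and the query into the component space
    P = Xc @ Vt[:n_components].T
    q = (query - mu) @ Vt[:n_components].T
    # weighted Euclidean distance: dominant components count more
    d = np.sqrt((((P - q) ** 2) * w).sum(axis=1))
    return np.argsort(d)[:k]
```

The retrieved indices would then select the historical cases that populate the evidence-enhanced prompt; encoding heterogeneous clinical features into the numeric matrix `X` is assumed to happen upstream.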
Problem

Research questions and friction points this paper is trying to address.

surgical duration prediction
retrieval-augmented LLMs
institution-specific clinical context
zero-shot inference
clinical alignment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Retrieval-Augmented LLMs
PCA-Weighted Retrieval
Bayesian Averaging
Surgical Duration Prediction
Zero-Shot Inference
Wanyin Wu
School of Information Science and Engineering, Yunnan University, Kunming 650500, China
Kanxue Li
School of Computer Science, Wuhan University, Wuhan 430072, China
Baosheng Yu
Assistant Professor, Nanyang Technological University
Machine Learning · Deep Learning · Computer Vision · AI for Medicine
Haoyun Zhao
School of Information Science and Engineering, Yunnan University, Kunming 650500, China
Yibing Zhan
Unknown affiliation
Dapeng Tao
Yunnan University
Hua Jin
First People's Hospital of Yunnan Province, Kunming 650032, China