Optimizing Blood Transfusions and Predicting Shortages in Resource-Constrained Areas

📅 2025-06-14
🏛️ BIOSTEC : HEALTHINF
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address inefficient blood transfusion management and delayed shortage alerts in resource-constrained regions, this paper proposes a proactive blood resource management system integrating heuristic scheduling with lightweight machine learning. Methodologically: (1) the authors design a multi-tiered heuristic matching framework that jointly optimizes geographic proximity, ABO/Rh compatibility, expiration priority, and rare-blood-type weighting for precise supply–demand alignment; (2) they develop a deployable forecasting approach—primarily linear regression, compared against LSTM and ARIMA—and integrate it with a Cassandra time-series database and simulation-based validation. Experimental results demonstrate a 47.6% improvement in request acceptance rate and an average absolute percentage difference of 1.40% for shortage prediction. The system achieves real-time responsiveness, horizontal scalability, and edge-deployment capability, offering a practical technical pathway for blood resource management at the community level.
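The shortage-prediction comparison in the summary can be sketched with a minimal linear-regression baseline. This is an illustrative sketch only: the 170-day length matches the paper, but the demand curve, train/holdout split, and all numbers below are synthetic placeholders, not the authors' data or model. The metric is the average absolute percentage difference the paper reports.

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = alpha + beta * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)
    return my - beta * mx, beta  # (alpha, beta)

def mape(actual, predicted):
    """Average absolute percentage difference between actual and predicted."""
    return 100.0 * sum(abs(a - p) / a
                       for a, p in zip(actual, predicted)) / len(actual)

# Synthetic stand-in for 170 days of daily blood demand (units/day):
# a slow upward trend plus a deterministic wobble (placeholder for noise).
days = list(range(170))
demand = [50.0 + 0.1 * t + 2.0 * math.sin(t / 3.0) for t in days]

# Train on the first 160 days, forecast the final 10, score with MAPE.
alpha, beta = fit_line(days[:160], demand[:160])
forecast = [alpha + beta * t for t in days[160:]]
error = mape(demand[160:], forecast)
print(f"holdout average absolute percentage difference: {error:.2f}%")
```

The same holdout loop would plug in LSTM or ARIMA forecasts in place of `alpha + beta * t`, which is how the three models can be compared on equal footing.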

📝 Abstract
Our research addresses the critical challenge of managing blood transfusions and optimizing allocation in resource-constrained regions. We present heuristic matching algorithms for donor-patient and blood bank selection, alongside machine learning methods to analyze blood transfusion acceptance data and predict potential shortages. We developed simulations to optimize blood bank operations, progressing from random allocation to a system incorporating proximity-based selection, blood type compatibility, expiration prioritization, and rarity scores. Moving from blind matching to a heuristic-based approach yielded a 28.6% marginal improvement in blood request acceptance, while a multi-level heuristic matching resulted in a 47.6% improvement. For shortage prediction, we compared Long Short-Term Memory (LSTM) networks, Linear Regression, and AutoRegressive Integrated Moving Average (ARIMA) models, trained on 170 days of historical data. Linear Regression slightly outperformed others with a 1.40% average absolute percentage difference in predictions. Our solution leverages a Cassandra NoSQL database, integrating heuristic optimization and shortage prediction to proactively manage blood resources. This scalable approach, designed for resource-constrained environments, considers factors such as proximity, blood type compatibility, inventory expiration, and rarity. Future developments will incorporate real-world data and additional variables to improve prediction accuracy and optimization performance.
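The multi-level heuristic described in the abstract—proximity, blood type compatibility, expiration prioritization, and rarity scores—can be sketched as a weighted scoring rule over candidate units. The compatibility table below is the standard ABO/Rh red-cell chart; the weights, rarity values, and the `Unit`/`score`/`best_unit` names are illustrative assumptions, not the paper's actual formulation.

```python
import math
from dataclasses import dataclass

# Standard ABO/Rh red-cell compatibility: recipient type -> acceptable donor types.
COMPATIBLE = {
    "O-":  {"O-"},
    "O+":  {"O-", "O+"},
    "A-":  {"O-", "A-"},
    "A+":  {"O-", "O+", "A-", "A+"},
    "B-":  {"O-", "B-"},
    "B+":  {"O-", "O+", "B-", "B+"},
    "AB-": {"O-", "A-", "B-", "AB-"},
    "AB+": {"O-", "O+", "A-", "A+", "B-", "B+", "AB-", "AB+"},
}

# Illustrative rarity weights (higher = scarcer, conserve when possible).
RARITY = {"AB-": 1.0, "O-": 0.9, "B-": 0.8, "A-": 0.6,
          "AB+": 0.5, "B+": 0.3, "O+": 0.1, "A+": 0.1}

@dataclass
class Unit:
    blood_type: str
    distance_km: float   # distance from the holding bank to the requester
    days_to_expiry: int

def score(unit: Unit, recipient_type: str,
          w_dist=1.0, w_exp=2.0, w_rare=5.0) -> float:
    """Lower is better; incompatible units score +inf and are never chosen."""
    if unit.blood_type not in COMPATIBLE[recipient_type]:
        return math.inf
    # Prefer nearby units, units expiring soonest (to reduce wastage), and
    # exact-type matches over spending scarce universal-donor stock.
    rarity_penalty = RARITY[unit.blood_type] if unit.blood_type != recipient_type else 0.0
    return (w_dist * unit.distance_km
            + w_exp * unit.days_to_expiry
            + w_rare * rarity_penalty)

def best_unit(units, recipient_type):
    """Return the lowest-scoring compatible unit, or None if none match."""
    ranked = sorted(units, key=lambda u: score(u, recipient_type))
    if ranked and score(ranked[0], recipient_type) < math.inf:
        return ranked[0]
    return None

# Example: an A+ request sees a nearby exact match beat a scarce O- unit.
stock = [Unit("O-", 2.0, 3), Unit("A+", 5.0, 2)]
print(best_unit(stock, "A+").blood_type)  # A+
```

Tuning the weights moves the system along the spectrum the abstract describes, from proximity-only selection up to the full multi-level match.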
Problem

Research questions and friction points this paper is trying to address.

Optimizing blood transfusion allocation in resource-limited regions
Predicting blood shortages using machine learning models
Improving blood bank operations with heuristic matching algorithms
Innovation

Methods, ideas, or system contributions that make the work stand out.

Heuristic algorithms optimize donor-patient matching
Machine learning predicts blood shortage risks
NoSQL database integrates optimization and prediction
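The Cassandra integration suggests a time-series data model along the following lines. This schema is a plausible sketch only—the keyspace, table, and column names are assumptions, not taken from the paper.

```sql
-- Illustrative CQL schema for per-type daily demand history.
CREATE TABLE IF NOT EXISTS blood_inventory.daily_demand (
    blood_type      text,
    day             date,
    observed_at     timestamp,
    units_requested int,
    units_fulfilled int,
    PRIMARY KEY ((blood_type, day), observed_at)
) WITH CLUSTERING ORDER BY (observed_at DESC);
```

Partitioning by `(blood_type, day)` keeps partitions bounded while letting the forecaster read one blood type's recent history with a handful of per-day queries; clustering by `observed_at DESC` returns the newest observations first. This write-heavy, append-only access pattern is where Cassandra's horizontal scalability, noted in the summary, pays off.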
El Arbi Belfarsi
Kennesaw State University, 1100 South Marietta Pkwy SE, Marietta, GA 30060
Sophie Brubaker
Kennesaw State University, 1100 South Marietta Pkwy SE, Marietta, GA 30060
Maria Valero
Associate Professor, Kennesaw State University
IoT for security and health, CPS, distributed computing, machine learning and