🤖 AI Summary
Early identification of diabetic foot ulcers (DFUs) is critical to reducing amputation risk and treatment costs; however, existing diagnostic tools lack point-of-care capability, multiplex biomarker detection, and cost-effectiveness. To address this, we developed a portable sensing system based on a 3D-printed microfluidic swab that enables simultaneous wound exudate collection and on-site quantification of multiple DFU biomarkers. We introduce a novel, calibration-free computer vision method, dual-image density contrast analysis, that eliminates interference from illumination variations and device-specific artifacts. The system integrates an iOS-based edge intelligence framework for real-time, automated classification of non-healing DFUs and dynamic tracking of healing trajectories. Validated in clinical settings, the platform achieves significantly improved early diagnostic accuracy compared to conventional methods. It offers a practical, scalable solution for DFU monitoring in primary-care and bedside settings, bridging a critical gap in accessible, quantitative wound management.
📝 Abstract
Diabetic foot ulcers (DFUs), a class of chronic wounds, affect ~750,000 individuals every year in the US alone, and identifying non-healing DFUs early, before they develop into chronic wounds, can drastically reduce treatment costs and minimize the risk of amputation. There is therefore a pressing need for diagnostic tools that can detect non-healing DFUs early. We develop low-cost, multi-analyte 3D-printed assays seamlessly integrated on swabs that can identify non-healing DFUs, together with a Wound Sensor iOS App, a mobile application developed for the controlled acquisition and automated analysis of wound sensor data. By comparing the original base image (before exposure to the wound) with the wound-exposed image, we developed automated computer vision techniques that quantify density changes between the two assay images, allowing us to automatically determine the severity of the wound. The iOS app ensures accurate data collection and presents actionable insights, despite challenges such as variations in camera configurations and ambient conditions. The proposed integrated sensor and iOS app will allow healthcare professionals to monitor wound conditions in real time, track healing progress, and assess critical parameters related to wound care.
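The dual-image comparison described above can be sketched in a few lines: by normalizing each assay spot's mean density against a blank reference patch in the *same* image, a uniform change in illumination between the base and wound-exposed photos cancels out of the ratio. This is a minimal, hypothetical illustration of the idea; the function name, region layout, and thresholds are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

def density_contrast(base: np.ndarray, exposed: np.ndarray,
                     ref_region: tuple, assay_region: tuple) -> float:
    """Relative density change of an assay spot between the base image
    (before wound exposure) and the wound-exposed image. Each reading is
    normalized by a blank reference patch from its own image, so a global
    illumination shift cancels out of the ratio. (Illustrative sketch.)"""
    def mean_density(img, region):
        r0, r1, c0, c1 = region
        return img[r0:r1, c0:c1].mean()

    base_ratio = mean_density(base, assay_region) / mean_density(base, ref_region)
    exposed_ratio = mean_density(exposed, assay_region) / mean_density(exposed, ref_region)
    return (exposed_ratio - base_ratio) / base_ratio

# Synthetic demo: the assay spot darkens by 50% after exposure, while the
# whole wound-exposed image is also 20% brighter (different lighting).
rng = np.random.default_rng(0)
base = np.full((100, 100), 200.0) + rng.normal(0, 1, (100, 100))
exposed = base * 1.2                 # global illumination change
exposed[40:60, 40:60] *= 0.5         # assay spot reacts to wound exudate
change = density_contrast(base, exposed,
                          ref_region=(0, 20, 0, 20),
                          assay_region=(40, 60, 40, 60))
print(f"relative density change: {change:.2f}")  # ≈ -0.50 despite lighting shift
```

Because both readings are ratios within their own image, the 20% brightness shift drops out and the recovered change reflects only the assay reaction; in practice the regions would come from detecting the printed assay geometry in the photo rather than fixed pixel coordinates.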