AI Summary
This study addresses the challenge of identifying spatial coexistence and conflict between humans and wildlife in protected areas. We propose a multi-perspective collaborative monitoring framework integrating ground-based visible/near-infrared camera-trap imagery with aerial thermal infrared UAV data. Methodologically, we employ YOLOv11s for species detection in camera-trap images and an enhanced Faster R-CNN for thermal UAV imagery, coupled with spatial hotspot analysis and activity overlap modeling. This enables, for the first time at the landscape scale, synchronous mapping of human and wildlife activity spaces and precise identification of conflict zones. Validated in Chitwan National Park, Nepal, the system achieves a detection mAP50 of 96.7%, successfully revealing activity hotspots and spatial overlap patterns. Our approach overcomes the limitations of single-platform monitoring, delivering a scalable, automated, multi-source fusion technical paradigm for early-warning risk assessment and adaptive management of human-wildlife conflict in protected areas.
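The mAP50 figure reported above counts a predicted bounding box as a true positive when its intersection-over-union (IoU) with a ground-truth box of the same class is at least 0.5. A minimal sketch of that overlap criterion (the box coordinates and helper names below are illustrative, not taken from the paper):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_true_positive_at_map50(pred_box, gt_box):
    # mAP50 uses a fixed IoU threshold of 0.5
    return iou(pred_box, gt_box) >= 0.5
```

The companion metric mAP50-95 averages the same computation over IoU thresholds from 0.5 to 0.95 in steps of 0.05, which is why it is always the stricter (lower) of the two numbers.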
Abstract
Wildlife and human activities are key components of landscape systems, and understanding their spatial distributions is essential for evaluating human-wildlife interactions and informing effective conservation planning. We conducted multi-perspective monitoring of wildlife and human activities by combining camera traps and drone imagery, capturing the spatial patterns of their distributions to identify overlaps in their activity zones and assess the degree of human-wildlife conflict. The study was conducted in Chitwan National Park (CNP), Nepal, and adjacent regions. Images collected by visible and near-infrared camera traps and thermal infrared drones from February to July 2022 were processed to create training and testing datasets, which were used to build deep learning models that automatically identify wildlife and human activities. Drone-collected thermal imagery was used for target detection, providing an additional monitoring perspective. Spatial pattern analysis was performed to identify animal and resident activity hotspots and to delineate potential human-wildlife conflict zones. Among the deep learning models tested, YOLOv11s achieved the highest performance, with a precision of 96.2%, recall of 92.3%, mAP50 of 96.7%, and mAP50-95 of 81.3%, making it the most effective for detecting objects in camera-trap imagery. Drone-based thermal imagery, analyzed with an enhanced Faster R-CNN model, added a complementary aerial viewpoint to the camera-trap detections. Spatial pattern analysis identified clear hotspots for both wildlife and human activities, and their overlapping patterns within certain areas of the CNP and its buffer zones indicate potential conflict. This study reveals human-wildlife conflicts within the conserved landscape. Integrating multi-perspective monitoring with automated object detection enhances wildlife surveillance and landscape management.
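The hotspot-overlap idea in the abstract can be illustrated with a toy grid-based sketch: bin detection locations into cells, flag cells with enough detections as hotspots, and measure how much the wildlife and human hotspot sets intersect. The cell size, count threshold, and function names below are illustrative assumptions; the paper's actual method uses spatial hotspot analysis and activity overlap modeling, not this simplified binning.

```python
from collections import Counter

def hotspot_cells(points, cell_size, min_count):
    """Bin (x, y) detection locations into square grid cells; a cell is a
    hotspot when it accumulates at least min_count detections."""
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in points
    )
    return {cell for cell, n in counts.items() if n >= min_count}

def overlap_fraction(wildlife_points, human_points, cell_size=1.0, min_count=2):
    """Jaccard-style overlap between wildlife and human hotspot cells:
    cells hot for both species groups divided by cells hot for either."""
    wild = hotspot_cells(wildlife_points, cell_size, min_count)
    human = hotspot_cells(human_points, cell_size, min_count)
    either = wild | human
    return len(wild & human) / len(either) if either else 0.0
```

Cells that are hotspots for both groups mark candidate conflict zones; the fraction summarizes how strongly the two activity spaces coincide across the landscape.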