INQUIRE-Search: A Framework for Interactive Discovery in Large-Scale Biodiversity Databases

📅 2025-11-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
Large biodiversity image databases (e.g., iNaturalist) contain rich ecological semantics—including animal behavior, interspecific interactions, phenology, and habitat information—yet such implicit knowledge remains largely untapped at scale due to coarse-grained metadata and inefficient manual validation. To address this, we propose the first ecology-oriented interactive natural language image search framework, integrating multimodal representation learning with an interpretable query interface. It enables semantic-level content retrieval, uncertainty quantification, and expert-in-the-loop verification. The open-source implementation operates end-to-end without requiring image annotations. Evaluated on five real-world ecological use cases—including phenological monitoring and post-wildfire recovery assessment—the framework achieves 100×–1000× improvements in search efficiency, substantially expanding the scientific utility of large-scale citizen-science image collections.
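The core retrieval mechanism described above — embedding both free-text queries and images into a shared multimodal space and ranking by similarity — can be illustrated with a minimal sketch. The function names and toy embeddings below are hypothetical; a real system like INQUIRE-Search would obtain embeddings from a pretrained vision-language model and index millions of images.

```python
# Minimal sketch of CLIP-style natural-language image retrieval.
# Assumption: embeddings come from a multimodal model; here we use
# toy random vectors so the example is self-contained and runnable.
import numpy as np

def normalize(v):
    """L2-normalize rows so dot products become cosine similarities."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def search(query_emb, image_embs, top_k=3):
    """Rank images by cosine similarity to the text-query embedding."""
    sims = normalize(image_embs) @ normalize(query_emb)
    ranked = np.argsort(-sims)[:top_k]
    return [(int(i), float(sims[i])) for i in ranked]

# Toy 4-dimensional embeddings standing in for model outputs.
rng = np.random.default_rng(0)
image_embs = rng.normal(size=(5, 4))
# Construct a query embedding that is nearly identical to image 2,
# mimicking a text query that semantically matches that image.
query_emb = image_embs[2] + 0.01 * rng.normal(size=4)

results = search(query_emb, image_embs)
```

In a deployed system the ranked results would then be shown to an expert for verification and export, which is the "expert-in-the-loop" step the summary describes.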

📝 Abstract
Large community science platforms such as iNaturalist contain hundreds of millions of biodiversity images that often capture ecological context on behaviors, interactions, phenology, and habitat. Yet most ecological workflows rely on metadata filtering or manual inspection, leaving this secondary information inaccessible at scale. We introduce INQUIRE-Search, an open-source system that enables scientists to rapidly and interactively search within an ecological image database for specific concepts using natural language, verify and export relevant observations, and use this discovered data for novel scientific analysis. Compared to traditional methods, INQUIRE-Search takes a fraction of the time, opening up new possibilities for the scientific questions that can be explored. Through five case studies, we show the diversity of scientific applications a tool like INQUIRE-Search can support, from seasonal variation in behavior across species to forest regrowth after wildfires. These examples demonstrate a new paradigm for interactive, efficient, and scalable scientific discovery that can begin to unlock previously inaccessible scientific value in large-scale biodiversity datasets. Finally, we emphasize that using such AI-enabled discovery tools for science calls for experts to reframe the priorities of the scientific process and to develop novel methods for experiment design, data collection, survey effort, and uncertainty analysis.
Problem

Research questions and friction points this paper is trying to address.

Enables interactive search in biodiversity databases using natural language
Unlocks inaccessible ecological context from large-scale biodiversity images
Supports novel scientific analysis through efficient concept-based discovery
Innovation

Methods, ideas, or system contributions that make the work stand out.

Interactive natural language search in biodiversity databases
Open-source system for rapid ecological image analysis
AI-enabled scalable discovery from large biodiversity datasets
Edward Vendrow
MIT
Julia Chae
Massachusetts Institute of Technology
Rupa Kurinchi-Vendhan
Massachusetts Institute of Technology
climate change · computer vision · ecology
Isaac Eckert
McGill University
Jazlynn Hall
Cary Institute of Ecosystem Studies
Marta Jarzyna
The Ohio State University
Reymond J. Miyajima
The Ohio State University
Ruth Oliver
University of California Santa Barbara
Laura J. Pollock
McGill University
Lauren Schrack
Massachusetts Institute of Technology
Scott W. Yanco
Smithsonian’s National Zoo & Conservation Biology Institute
Oisin Mac Aodha
Reader (Associate Professor), University of Edinburgh
Computer Vision · Machine Learning · Machine Teaching · Active Learning · Conservation Technology
Sara Beery
Assistant Professor at MIT CSAIL
Computer Vision · Conservation Technology · Computational Sustainability · Camera Trapping