MicroNN: An On-device Disk-resident Updatable Vector Database

📅 2025-04-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address memory constraints, dynamic index updates (insertions/deletions), and hybrid queries (approximate nearest-neighbour vector search combined with structured attribute filtering) on low-resource edge devices, this paper proposes a lightweight, embeddable, disk-resident, and real-time-updatable vector index architecture. Methodologically, it integrates a compact disk-based index design, incremental insertion/deletion mechanisms, multi-condition joint query optimization, and a memory-aware approximate nearest-neighbour algorithm. Evaluated on million-scale benchmarks, the system achieves 90% recall with top-100 retrieval latency under 7 ms while using only ~10 MB of RAM. The authors report it as the first system to enable high-accuracy, low-latency, and highly dynamic hybrid search on edge devices under such tight memory budgets, and it has been deployed across multiple real-world edge scenarios.
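The hybrid-query idea the summary describes, combining a structured attribute filter with nearest-neighbour search, can be sketched as below. This is an illustrative toy (brute-force, in-memory), not MicroNN's disk-resident approximate index; the function name and predicate interface are assumptions for the example.

```python
import numpy as np

def hybrid_knn(vectors, attrs, query, predicate, k=5):
    """Toy hybrid query: structured attribute filter + exact k-NN.

    vectors: (n, d) float array; attrs: per-row metadata dicts;
    predicate: boolean function over a metadata dict. Illustrative
    only -- MicroNN uses a disk-resident approximate index rather
    than this brute-force scan.
    """
    # Pre-filter: keep only rows whose attributes satisfy the predicate.
    idx = np.array([i for i, a in enumerate(attrs) if predicate(a)])
    if idx.size == 0:
        return []
    # Exact nearest-neighbour search over the surviving candidates.
    dists = np.linalg.norm(vectors[idx] - query, axis=1)
    order = np.argsort(dists)[:k]
    return [(int(idx[i]), float(dists[i])) for i in order]

rng = np.random.default_rng(0)
vecs = rng.standard_normal((1000, 16)).astype(np.float32)
attrs = [{"lang": "en" if i % 2 == 0 else "fr"} for i in range(1000)]
hits = hybrid_knn(vecs, attrs, vecs[0], lambda a: a["lang"] == "en", k=3)
```

Real systems must decide between pre-filtering (as here) and post-filtering the approximate-search results; the paper's multi-condition query optimization addresses exactly that trade-off.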

📝 Abstract
Nearest neighbour search over dense vector collections has important applications in information retrieval, retrieval augmented generation (RAG), and content ranking. Performing efficient search over large vector collections is a well studied problem with many existing approaches and open source implementations. However, most state-of-the-art systems are generally targeted towards scenarios using large servers with an abundance of memory, static vector collections that are not updatable, and nearest neighbour search in isolation of other search criteria. We present Micro Nearest Neighbour (MicroNN), an embedded nearest-neighbour vector search engine designed for scalable similarity search in low-resource environments. MicroNN addresses the problem of on-device vector search for real-world workloads containing updates and hybrid search queries that combine nearest neighbour search with structured attribute filters. In this scenario, memory is highly constrained, so disk-efficient index structures and algorithms are required, as well as support for continuous inserts and deletes. MicroNN is an embeddable library that can scale to large vector collections with minimal resources. MicroNN is used in production and powers a wide range of vector search use-cases on-device. MicroNN takes less than 7 ms to retrieve the top-100 nearest neighbours with 90% recall on a publicly available million-scale vector benchmark while using ~10 MB of memory.
Problem

Research questions and friction points this paper is trying to address.

Efficient on-device vector search in low-resource environments
Support for real-time updates and hybrid search queries
Disk-efficient index structures for memory-constrained devices
Innovation

Methods, ideas, or system contributions that make the work stand out.

On-device disk-resident vector database
Supports updates and hybrid search queries
Minimal memory usage with high efficiency
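The disk-resident, updatable design listed above can be illustrated with a minimal sketch: vectors live on disk and queries stream rows instead of holding the collection in memory. The SQLite schema, class name, and full-scan query here are assumptions for the example, not MicroNN's actual index layout.

```python
import os
import sqlite3
import struct
import tempfile

class ToyVectorStore:
    """Sketch of a disk-resident, updatable vector store.

    Vectors are stored as float32 BLOBs in an embedded SQLite file;
    inserts and deletes hit the disk immediately. Queries stream rows
    from disk (a brute-force scan here; MicroNN uses a clustered
    disk-based index instead).
    """

    def __init__(self, path):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS vec (id INTEGER PRIMARY KEY, v BLOB)")

    def insert(self, vid, vector):
        blob = struct.pack(f"{len(vector)}f", *vector)
        self.db.execute("INSERT OR REPLACE INTO vec VALUES (?, ?)",
                        (vid, blob))
        self.db.commit()

    def delete(self, vid):
        self.db.execute("DELETE FROM vec WHERE id = ?", (vid,))
        self.db.commit()

    def knn(self, query, k=5):
        # Squared Euclidean distance, decoding each BLOB as it streams in.
        def dist(blob):
            v = struct.unpack(f"{len(blob) // 4}f", blob)
            return sum((a - b) ** 2 for a, b in zip(v, query))
        rows = self.db.execute("SELECT id, v FROM vec")
        scored = sorted((dist(b), i) for i, b in rows)
        return [i for _, i in scored[:k]]

path = os.path.join(tempfile.mkdtemp(), "vectors.db")
store = ToyVectorStore(path)
for i in range(100):
    store.insert(i, [float(i), 0.0])
store.delete(3)          # deletes take effect on the next query
nearest = store.knn([2.5, 0.0], k=2)
```

The point of the sketch is the contract, not the algorithm: insert/delete are durable on disk, and search never materializes the whole collection in RAM.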