OSMGen: Highly Controllable Satellite Image Synthesis using OpenStreetMap Data

📅 2025-10-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
Urban planning and environmental management are hindered by outdated geospatial data and scarce high-quality annotations, limiting automated monitoring and simulation-based scenario analysis. To address this, we propose the first end-to-end framework for generating high-fidelity satellite imagery directly from OpenStreetMap (OSM) JSON inputs. Our method enables multi-dimensional controllable synthesis—leveraging vector geometry, semantic labels, geographic coordinates, and timestamps—and uniquely produces temporally consistent before-and-after image pairs. By tightly integrating OSM’s structured geospatial representations with generative models, the framework supports localized scene editing and rapid visualization of urban planning proposals. We further construct a large-scale, high-quality paired dataset (OSM JSON + corresponding satellite images), which substantially alleviates training data scarcity and class imbalance. This dataset facilitates closed-loop updating between remote sensing imagery and vector maps, establishing a scalable, data-driven foundation for dynamic urban monitoring and intelligent planning.

📝 Abstract
Accurate and up-to-date geospatial data are essential for urban planning, infrastructure monitoring, and environmental management. Yet, automating urban monitoring remains difficult because curated datasets of specific urban features and their changes are scarce. We introduce OSMGen, a generative framework that creates realistic satellite imagery directly from raw OpenStreetMap (OSM) data. Unlike prior work that relies on raster tiles, OSMGen uses the full richness of OSM JSON, including vector geometries, semantic tags, location, and time, giving fine-grained control over how scenes are generated. A central feature of the framework is the ability to produce consistent before-after image pairs: user edits to OSM inputs translate into targeted visual changes, while the rest of the scene is preserved. This makes it possible to generate training data that addresses scarcity and class imbalance, and to give planners a simple way to preview proposed interventions by editing map data. More broadly, OSMGen produces paired (JSON, image) data for both static and changed states, paving the way toward a closed-loop system where satellite imagery can automatically drive structured OSM updates. Source code is available at https://github.com/amir-zsh/OSMGen.
Problem

Research questions and friction points this paper is trying to address.

Generating realistic satellite imagery from OpenStreetMap data
Addressing scarcity of curated urban feature datasets for monitoring
Producing consistent before-after image pairs for urban changes
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generates satellite imagery from raw OpenStreetMap data
Uses vector geometries and semantic tags for control
Produces consistent before-after image pairs from edits
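
The edit-driven before/after idea can be sketched in a few lines. The snippet below uses the standard Overpass-style OSM JSON layout (`elements` with `type`, `tags`, and `geometry`); the exact input schema OSMGen consumes, the example coordinates, and the `timestamp` field placement are assumptions for illustration, not taken from the paper.

```python
import copy

# Hypothetical OSM JSON scene in Overpass-style format: a single closed
# way tagged as grass. (OSMGen's real input schema may differ.)
before = {
    "timestamp": "2023-06-01T00:00:00Z",
    "elements": [
        {
            "type": "way",
            "id": 1001,
            "tags": {"landuse": "grass"},
            "geometry": [
                {"lat": 48.8566, "lon": 2.3522},
                {"lat": 48.8567, "lon": 2.3530},
                {"lat": 48.8560, "lon": 2.3531},
                {"lat": 48.8566, "lon": 2.3522},
            ],
        }
    ],
}

# A targeted user edit: retag the grass plot as a residential building,
# leaving the geometry and every other element untouched.
after = copy.deepcopy(before)
after["elements"][0]["tags"] = {"building": "residential"}
after["timestamp"] = "2024-06-01T00:00:00Z"

# The (before, after) JSON pair would then condition the generator to
# render a temporally consistent before/after satellite image pair,
# where only the edited plot changes visually.
```

Because the edit is expressed in the vector/tag domain rather than in pixels, the unchanged elements give the model an explicit signal for what must be preserved between the two renders.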