LeanGeo: Formalizing Competition-Level Geometry Problems in Lean

📅 2025-08-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Geometric theorem proving systems have long suffered from inconsistent formal representations, poor interoperability with other mathematical domains, and reliance on informal diagrammatic reasoning that resists machine verification. To address these challenges, the authors introduce LeanGeo, a competition-grade formal framework built on Lean 4 and Mathlib that enables precise encoding and machine-checked verification of geometric theorems. They also release LeanGeo-Bench, an open-source benchmark comprising problems from the International Mathematical Olympiad (IMO) and other advanced competitions. Evaluations on this benchmark expose the limitations of current large language models in structured geometric deduction and provide a verifiable, extensible infrastructure, along with rigorous evaluation criteria, for research at the intersection of formal mathematics and AI.

📝 Abstract
Geometry problems are a crucial testbed for AI reasoning capabilities. Most existing geometry-solving systems cannot express problems within a unified framework and are therefore difficult to integrate with other mathematical fields. Moreover, because most geometric proofs rely on intuitive diagrams, verifying geometry problems is particularly challenging. To address these gaps, we introduce LeanGeo, a unified formal system for formalizing and solving competition-level geometry problems within the Lean 4 theorem prover. LeanGeo features a comprehensive library of high-level geometric theorems grounded in Lean's foundational logic, enabling rigorous proof verification and seamless integration with Mathlib. We also present LeanGeo-Bench, a formal geometry benchmark in LeanGeo comprising problems from the International Mathematical Olympiad (IMO) and other advanced sources. Our evaluation demonstrates the capabilities and limitations of state-of-the-art large language models on this benchmark, highlighting the need for further advances in automated geometric reasoning. We open-source the theorem library and the benchmark of LeanGeo at https://github.com/project-numina/LeanGeo/tree/master.
Problem

Research questions and friction points this paper is trying to address.

Formalizing competition-level geometry problems in Lean
Enabling rigorous proof verification for geometry theorems
Integrating geometric reasoning with other mathematical fields
Innovation

Methods, ideas, or system contributions that make the work stand out.

Formalizing geometry problems in Lean 4
Comprehensive library of geometric theorems
Rigorous proof verification with Mathlib integration
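To illustrate the kind of statement such a library supports, here is a minimal Lean 4 sketch using Mathlib's `EuclideanGeometry` namespace. This is an illustrative assumption about the style of a formalization, not LeanGeo's actual API: it states the competition-style fact that the base angles of an isosceles triangle are equal.

```lean
import Mathlib

open EuclideanGeometry

-- Illustrative sketch only (not LeanGeo's own theorem library):
-- a competition-style fact stated over Mathlib's Euclidean geometry.
-- `∠ A B C` denotes the angle at vertex B.
example (A B C : EuclideanSpace ℝ (Fin 2))
    (h : dist A B = dist A C) :
    ∠ A B C = ∠ A C B :=
  -- Base angles of an isosceles triangle; Mathlib provides this
  -- as `EuclideanGeometry.angle_eq_angle_of_dist_eq`.
  angle_eq_angle_of_dist_eq h
```

Because the statement type-checks against Mathlib's foundational definitions, the proof is machine-verified rather than read off a diagram, which is the core benefit the paper claims for LeanGeo.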
Chendong Song (Moonshot AI)
Zihan Wang (Peking University)
Frederick Pu (University of Toronto)
Haiming Wang
Xiaohan Lin (Moonshot AI)
Junqi Liu (Moonshot AI)
Jia Li (Numina)
Zhengying Liu (Moonshot AI)