Learning Conjecturing from Scratch

📅 2025-03-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work targets 16,197 hard problems derived from OEIS sequences that require joint inductive and arithmetic reasoning. It proposes the first neural-symbolic closed-loop framework that discovers inductive predicates from scratch. Methodologically, it runs a self-iterative feedback loop: a neural program translator maps each problem to candidate inductive predicates; Z3 quickly checks whether each candidate makes the problem provable; and the best candidates, selected for conciseness and verification speed, drive the next round of self-distillation training. Crucially, the approach needs no human annotations or predefined templates, replacing supervised learning with provability-driven discovery. Within a 60-second limit, the method solves 5,565 problems, roughly 145% more than the 2,265 solved by CVC5, Vampire, or Z3, and it autonomously discovers many novel, mathematically meaningful inductive predicates.

📝 Abstract
We develop a self-learning approach for conjecturing induction predicates on a dataset of 16197 problems derived from the OEIS. These problems are hard for today's SMT and ATP systems because they require a combination of inductive and arithmetical reasoning. Starting from scratch, our approach consists of a feedback loop that iterates between (i) training a neural translator to learn the correspondence between the problems solved so far and the induction predicates useful for them, (ii) using the trained neural system to generate many new induction predicates for the problems, (iii) fast runs of the Z3 prover attempting to prove the problems using the generated predicates, (iv) using heuristics such as predicate size and solution speed on the proved problems to choose the best predicates for the next iteration of training. The algorithm discovers on its own many interesting induction predicates, ultimately solving 5565 problems, compared to 2265 problems solved by CVC5, Vampire, or Z3 in 60 seconds.
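The four-step loop in the abstract can be sketched as plain Python. This is a toy illustration, not the authors' system: `generate`, `prove`, and `retrain` are hypothetical stand-ins for the neural translator, the fast Z3 runs, and the self-distillation training, supplied here by the caller.

```python
# Toy, runnable sketch of the feedback loop (i)-(iv) from the abstract.
# `generate`, `prove`, and `retrain` are hypothetical stand-ins, not the
# authors' neural translator or Z3 harness.

def score(predicate, prove_time):
    # (iv) heuristic from the abstract: prefer small predicates proved fast
    return len(predicate) + prove_time

def feedback_loop(problems, generate, prove, retrain, iterations=3):
    model = None   # "from scratch": no trained translator in round one
    solved = {}    # problem -> (best predicate so far, its score)
    for _ in range(iterations):
        for problem in problems:
            # (ii) generate candidate induction predicates for the problem
            for pred in generate(model, problem):
                # (iii) fast prover attempt using the candidate predicate
                ok, secs = prove(problem, pred)
                if ok:
                    s = score(pred, secs)
                    if problem not in solved or s < solved[problem][1]:
                        solved[problem] = (pred, s)
        # (i) retrain the translator on (problem, predicate) pairs solved so far
        model = retrain(model, solved)
    return solved
```

The loop is monotone: once a problem is proved, later iterations can only replace its predicate with a smaller or faster one, which is what feeds cleaner training data into each round.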
Problem

Research questions and friction points this paper is trying to address.

Develops a self-learning approach for conjecturing induction predicates on a dataset of 16197 OEIS-derived problems.
Combines a neural translator with the Z3 prover for joint inductive and arithmetical reasoning.
Solves 5565 problems, versus 2265 solved by CVC5, Vampire, or Z3 in 60 seconds.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Self-learning approach for induction predicates
Neural translator for problem-predicate correspondence
Heuristic-based predicate selection (predicate size, solution speed) for iterative training
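The selection heuristic in the last bullet can be isolated as a small function: among all candidates the prover accepted, keep one predicate per problem, preferring small size and then fast proof time. The triple representation below is an assumption for illustration, not the paper's data format.

```python
# Minimal sketch of the heuristic selection step: pick, per problem,
# the accepted predicate with the smallest size, breaking ties by speed.
# The (problem, predicate, seconds) triple format is assumed here.

def select_best(proved):
    """proved: iterable of (problem, predicate, prove_seconds) triples."""
    best = {}
    for problem, pred, secs in proved:
        key = (len(pred), secs)  # predicate size first, then solution speed
        if problem not in best or key < best[problem][1]:
            best[problem] = (pred, key)
    # the winners become training data for the next iteration
    return {problem: pred for problem, (pred, _) in best.items()}
```

Preferring concise predicates acts as a regularizer on the training set: shorter predicates are easier for the translator to reproduce and tend to generalize to related sequences.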