Revisiting Non-Acyclic GFlowNets in Discrete Environments

📅 2025-02-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses a core limitation of GFlowNets, namely their reliance on directed acyclic graphs (DAGs) in discrete settings, by revisiting and simplifying the theory of *non-acyclic* GFlowNets on **general directed graphs (including cyclic ones)**. Methodologically, it relaxes the DAG assumption and rebuilds the theoretical framework around generalized flow conditions; clarifies the relationship between fixed backward policies and the flow function; draws connections between non-acyclic GFlowNets and entropy-regularized reinforcement learning; and re-examines the notion of loss stability for training on arbitrary graph structures. These results naturally generalize the corresponding concepts and theoretical results from the acyclic setting. Empirically, the paper validates its theoretical findings and examines the stability of training losses, providing a cleaner foundation for applying GFlowNets to discrete environments whose state graphs contain cycles.
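
For orientation, here is a minimal sketch of the flow conditions this line of work builds on, written in standard GFlowNet notation rather than the paper's own formulation; the non-acyclic setting keeps constraints of the same form but allows the underlying state graph to contain cycles.

```latex
% Illustrative GFlowNet conditions in standard notation (an assumption for context,
% not the paper's exact formulation).
\begin{align*}
  F(s)\, P_F(s' \mid s) &= F(s')\, P_B(s \mid s')
    && \text{detailed balance on every edge } s \to s' \\
  \sum_{s:\, s \to s'} F(s \to s') &= \sum_{s'':\, s' \to s''} F(s' \to s'')
    && \text{flow matching at every intermediate state } s' \\
  F(x \to s_f) &= R(x)
    && \text{terminal flow equals the reward}
\end{align*}
% In a non-acyclic environment, trajectories are no longer bounded in length by the
% graph structure, so existence and finiteness of flows require additional care.
```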

📝 Abstract
Generative Flow Networks (GFlowNets) are a family of generative models that learn to sample objects from a given probability distribution, potentially known up to a normalizing constant. Instead of working in the object space, GFlowNets proceed by sampling trajectories in an appropriately constructed directed acyclic graph environment, greatly relying on the acyclicity of the graph. In our paper, we revisit the theory that relaxes the acyclicity assumption and present a simpler theoretical framework for non-acyclic GFlowNets in discrete environments. Moreover, we provide various novel theoretical insights related to training with fixed backward policies, the nature of flow functions, and connections between entropy-regularized RL and non-acyclic GFlowNets, which naturally generalize the respective concepts and theoretical results from the acyclic setting. In addition, we experimentally re-examine the concept of loss stability in non-acyclic GFlowNet training, as well as validate our own theoretical findings.
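
As a purely illustrative sketch, and not the authors' code, the snippet below fits tabular log-flows and forward/backward policies by minimizing a detailed-balance-style squared residual on a tiny state graph that contains a cycle. The environment, parameterization, and simplified terminal condition are all assumptions for the example; the sketch also glosses over the stability and finiteness issues that the paper analyzes.

```python
# Illustrative, hypothetical sketch of a detailed-balance-style objective on a
# small state graph with a cycle (1 <-> 2). Not the paper's actual algorithm.
import torch

edges = [(0, 1), (1, 2), (2, 1), (1, 3), (2, 3)]      # directed edges; 1 <-> 2 forms a cycle
n_states = 4
terminal_state, log_reward = 3, torch.tensor(0.0)     # log R(x) = 0 for the single terminal state

out_edges = {s: [t for (a, t) in edges if a == s] for s in range(n_states)}
in_edges = {s: [a for (a, t) in edges if t == s] for s in range(n_states)}

# Tabular parameters: one log-flow per state, unnormalized policy logits per edge.
log_F = torch.zeros(n_states, requires_grad=True)
fwd_logits = torch.zeros(n_states, n_states, requires_grad=True)
bwd_logits = torch.zeros(n_states, n_states, requires_grad=True)

def edge_log_prob(logits, src, dst, allowed):
    # Log-probability of picking `dst` among the states `allowed` for `src`.
    scores = logits[src, allowed]
    return torch.log_softmax(scores, dim=0)[allowed.index(dst)]

opt = torch.optim.Adam([log_F, fwd_logits, bwd_logits], lr=0.05)
for step in range(200):
    loss = (log_F[terminal_state] - log_reward) ** 2              # simplified terminal condition
    for (s, t) in edges:
        log_pf = edge_log_prob(fwd_logits, s, t, out_edges[s])    # log P_F(t | s)
        log_pb = edge_log_prob(bwd_logits, t, s, in_edges[t])     # log P_B(s | t)
        # Detailed-balance residual: log F(s) + log P_F(t|s) - log F(t) - log P_B(s|t).
        loss = loss + (log_F[s] + log_pf - log_F[t] - log_pb) ** 2
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The paper's experimental focus on loss stability concerns precisely how such objectives behave once cycles make trajectories unbounded.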
Problem

Research questions and friction points this paper is trying to address.

Relaxing acyclicity assumption in GFlowNets
Simplifying theory for non-acyclic GFlowNets
Exploring loss stability in non-acyclic training
Innovation

Methods, ideas, or system contributions that make the work stand out.

Relaxes acyclicity assumption
Simplifies the non-acyclic GFlowNet framework
Explores entropy-regularized RL connections