🤖 AI Summary
This paper addresses the challenge of transferring probability mass from a source distribution to a target distribution in discrete data generation. It proposes ECD²G, a novel generative method grounded in circuit theory that models probability mass transport as steady-state current flow in an electrical circuit. The method establishes a rigorous correspondence between probability flows and fundamental circuit laws (Ohm's and Kirchhoff's laws), parameterizing node potentials with neural networks to implicitly learn optimal probability flows and to transport samples along circuit paths. To the authors' knowledge, this is the first systematic integration of circuit theory into discrete generative modeling, yielding a physically inspired, interpretable, and verifiable paradigm. The paper provides theoretical guarantees on the convergence and correctness of distributional transfer, and proof-of-concept experiments demonstrate ECD²G's effectiveness and feasibility for discrete-space mapping tasks.
📝 Abstract
We propose $\textbf{E}$lectric $\textbf{C}$urrent $\textbf{D}$iscrete $\textbf{D}$ata $\textbf{G}$eneration (ECD$^{2}$G), a pioneering method for data generation in discrete settings that is grounded in electrical engineering theory. Our approach draws an analogy between electric current flow in a circuit and the transfer of probability mass between data distributions. We interpret samples from the source distribution as current input nodes of a circuit and samples from the target distribution as current output nodes. A neural network is then used to learn the electric currents that represent the probability flow in the circuit. To map the source distribution to the target, we sample from the source and transport these samples along the circuit pathways according to the learned currents. This process provably guarantees transfer between data distributions. We present proof-of-concept experiments to illustrate our ECD$^{2}$G method.
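The circuit analogy underlying ECD$^{2}$G can be sketched numerically. The paper itself learns currents with a neural network; the toy below is *not* that method but a minimal illustration of the physics it invokes: probability mass injected at source nodes and withdrawn at target nodes induces a unique steady-state current via Kirchhoff's current law (a graph-Laplacian solve for node potentials) and Ohm's law (edge currents from potential differences). The 4-node network, its conductances, and the injected masses are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy resistor network: nodes 0, 1 act as current inputs (source samples),
# nodes 2, 3 as current outputs (target samples).
# G[u, v] = conductance (1 / resistance) of the wire between u and v.
G = np.array([
    [0.0, 1.0, 2.0, 1.0],
    [1.0, 0.0, 1.0, 2.0],
    [2.0, 1.0, 0.0, 1.0],
    [1.0, 2.0, 1.0, 0.0],
])

# Net current injection per node: source mass enters (+), target mass
# leaves (-). Kirchhoff's current law requires the injections to sum to 0.
inj = np.array([0.5, 0.5, -0.4, -0.6])
assert abs(inj.sum()) < 1e-12

# Graph Laplacian L = D - G. Solving L v = inj gives node potentials
# (the quantity ECD^2G parameterizes with a neural network). Ground
# node 3 (v = 0) to remove the Laplacian's nullspace.
L = np.diag(G.sum(axis=1)) - G
v = np.zeros(4)
v[:3] = np.linalg.solve(L[:3, :3], inj[:3])

# Ohm's law: current on edge (u, v) is g_uv * (v_u - v_v).
I = G * (v[:, None] - v[None, :])

# Sanity check: net outflow at every node equals its injection,
# i.e. Kirchhoff's current law holds at the solved potentials.
assert np.allclose(I.sum(axis=1), inj)
```

Sampling then amounts to walking a source sample along edges with probability proportional to the positive outgoing currents until it reaches a target node; the conservation property checked above is what makes the transported mass match the target distribution.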