Dale meets Langevin: A Multiplicative Denoising Diffusion Model

📅 2025-10-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses biologically plausible generative modeling by proposing the first multiplicative denoising diffusion framework that strictly adheres to Dale's law, under which excitatory and inhibitory synaptic roles are non-interchangeable. Methodologically, it employs geometric Brownian motion as the forward process and constructs a multiplicative reverse stochastic differential equation (SDE) under a log-normal prior. Leveraging multiplicative score matching and exponential gradient descent, it derives biologically plausible update rules that satisfy Dale's constraint, marking the first integration of Dale's law into Langevin-dynamics-based generative modeling. Experiments on MNIST, Fashion-MNIST, and Kuzushiji demonstrate high-fidelity image generation. Key contributions: (1) the first Dale-compliant multiplicative diffusion mechanism; (2) a theoretical foundation for multiplicative score matching tailored to non-negative data; and (3) a novel generative modeling paradigm grounded in computational neuroscience principles.

📝 Abstract
Gradient descent has proven to be a powerful and effective technique for optimization in numerous machine learning applications. Recent advances in computational neuroscience have shown that learning in the standard gradient descent optimization formulation is not consistent with learning in biological systems. This has opened up interesting avenues for building biologically inspired learning techniques. One such approach is inspired by Dale's law, which states that inhibitory and excitatory synapses do not swap roles during the course of learning. The resulting exponential gradient descent optimization scheme leads to log-normally distributed synaptic weights. Interestingly, the density that satisfies the Fokker-Planck equation corresponding to the stochastic differential equation (SDE) with geometric Brownian motion (GBM) is the log-normal density. Leveraging this connection, we start with the SDE governing geometric Brownian motion and show that discretizing the corresponding reverse-time SDE yields a multiplicative update rule, which, surprisingly, coincides with the sampling equivalent of the exponential gradient descent update founded on Dale's law. Furthermore, we propose a new formalism for multiplicative denoising score matching, subsuming the loss function proposed by Hyvärinen for non-negative data. Indeed, log-normally distributed data are positive, and the proposed score-matching formalism turns out to be a natural fit. This allows for training of score-based models on image data and results in a novel multiplicative update scheme for sample generation starting from a log-normal density. Experimental results on the MNIST, Fashion-MNIST, and Kuzushiji datasets demonstrate the generative capability of the new scheme. To the best of our knowledge, this is the first instance of a biologically inspired generative model employing multiplicative updates, founded on geometric Brownian motion.
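To make the forward process concrete, here is a minimal NumPy sketch of the GBM SDE dx = μx dt + σx dW, simulated via its exact log-space transition. All names and parameter values are illustrative assumptions, not the paper's implementation; the point is only that multiplicative noise keeps samples positive with log-normal marginals.

```python
import numpy as np

rng = np.random.default_rng(0)

def gbm_forward(x0, mu=0.0, sigma=0.5, dt=0.01, steps=100):
    """Simulate the GBM forward SDE dx = mu*x dt + sigma*x dW using the
    exact log-space transition; every step is a multiplicative noise
    injection, so positivity is preserved throughout."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        z = rng.standard_normal(x.shape)
        x = x * np.exp((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
    return x

x0 = np.full(10_000, 2.0)          # all samples start at x = 2
xT = gbm_forward(x0)               # after time T = steps * dt = 1
print(xT.min())                    # strictly positive
print(np.log(xT).mean())           # log xT ~ N(log 2 + (mu - sigma^2/2) T, sigma^2 T)
```

With these toy parameters, log xT concentrates around log 2 − σ²/2 ≈ 0.568, consistent with the log-normal density that solves the corresponding Fokker-Planck equation.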
Problem

Research questions and friction points this paper is trying to address.

Develops biologically inspired generative models using multiplicative updates
Connects geometric Brownian motion with exponential gradient descent optimization
Proposes multiplicative denoising score-matching for log-normal data distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multiplicative denoising diffusion model inspired by Dale's law
Geometric Brownian motion discretization yields multiplicative updates
Score-matching formalism for log-normally distributed data generation
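The sign-preserving character of the multiplicative update can be illustrated with a toy exponential gradient descent step (a hypothetical example on a quadratic loss, not the paper's training rule): since exp(·) > 0, each weight keeps its sign, which is exactly Dale's constraint that excitatory and inhibitory synapses never swap roles.

```python
import numpy as np

def exponential_gd_step(w, grad, eta=0.1):
    """One exponential gradient descent step: w <- w * exp(-eta * grad).
    The factor exp(-eta * grad) is always positive, so no weight can
    cross zero -- excitatory stays excitatory, inhibitory stays inhibitory."""
    return w * np.exp(-eta * grad)

# toy quadratic loss L(w) = 0.5 * ||w - target||^2 over positive weights
w = np.array([0.5, 2.0, 1.5])
target = np.array([1.0, 1.0, 1.0])
for _ in range(200):
    w = exponential_gd_step(w, w - target, eta=0.1)
print(w)  # converges multiplicatively toward target, staying positive
```

The same structure appears when the reverse-time GBM SDE is discretized: the score term enters through an exponential factor, giving a sampling rule that is multiplicative rather than additive.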
Nishanth Shetty
Department of Electrical Engineering, Indian Institute of Science, Bengaluru 560012
Madhava Prasath
Department of Electrical Engineering, Indian Institute of Science, Bengaluru 560012
Chandra Sekhar Seelamantula
Department of Electrical Engineering, Indian Institute of Science, Bengaluru 560012
Research areas: Sparse Signal/Image Processing and Imaging, Sampling Theories, Wavelets and Splines, Speech/Audio Processing, Machine Learning