Efficient Encrypted Computation in Convolutional Spiking Neural Networks with TFHE

πŸ“… 2026-03-24
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work addresses the challenge of efficiently evaluating continuous non-polynomial functions under fully homomorphic encryption (FHE) by proposing FHE-DiCSNN, the first framework to integrate convolutional spiking neural networks (SNNs) with TFHE. Leveraging the inherent discrete nature of SNNs to align with FHE’s computational constraints, the framework implements spiking neuron models such as Leaky Integrate-and-Fire directly on ciphertexts using bootstrapping. It further enhances inference efficiency through stochastic encoding and parallelized bootstrapping. The approach enables secure evaluation of SNNs of arbitrary depth with less than 3% accuracy degradation on MNIST and Fashion-MNIST, achieves inference times under one second per sample, and demonstrates practical applicability in real-world medical image classification tasks.
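The stochastic encoding mentioned above converts pixel intensities into spike trains before any encrypted processing. As a plaintext illustration only (the function name, signature, and parameters here are illustrative assumptions, not the paper's implementation), Bernoulli rate encoding fires each input neuron at each time step with probability equal to the normalized pixel value:

```python
import numpy as np

def rate_encode(image, T=8, rng=None):
    """Stochastic (Bernoulli) rate encoding: each pixel intensity in [0, 1]
    becomes a spike train of length T that fires with probability equal to
    the intensity. Illustrative sketch, not the paper's encrypted pipeline."""
    rng = np.random.default_rng(0) if rng is None else rng
    p = np.clip(image, 0.0, 1.0)
    # Draw T independent Bernoulli samples per pixel; shape (T, *image.shape).
    return (rng.random((T,) + p.shape) < p).astype(int)
```

Shortening the simulation window T is one way such frameworks trade accuracy for fewer (expensive) bootstrapping operations, which is consistent with the paper's use of convolution to reduce the simulation time tied to random encoding.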
πŸ“ Abstract
With the rapid advancement of AI technology, concerns about data privacy have grown, spurring cutting-edge research on machine learning over encrypted data. Fully Homomorphic Encryption (FHE) is a crucial technology for privacy-preserving computation, yet it struggles with continuous non-polynomial functions, since it operates on discrete integers and supports only addition and multiplication. Spiking Neural Networks (SNNs), which use discrete spike signals, naturally complement FHE's characteristics. In this paper, we introduce FHE-DiCSNN, a framework built on the TFHE scheme that exploits the discrete nature of SNNs for secure and efficient computation. By leveraging bootstrapping techniques, we implement Leaky Integrate-and-Fire (LIF) neuron models directly on ciphertexts, enabling SNNs of arbitrary depth. Our framework is adaptable to other spiking neuron models, offering a novel approach to the homomorphic evaluation of SNNs. Additionally, we integrate convolutional methods inspired by CNNs to improve accuracy and reduce the simulation time associated with random encoding, and parallel computation further accelerates bootstrapping operations. Experimental results on the MNIST and Fashion-MNIST datasets validate the effectiveness of FHE-DiCSNN, with an accuracy loss of less than 3% relative to plaintext and a computation time of under 1 second per prediction. We also apply the model to real medical image classification problems and analyze parameter optimization and selection.
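To make the LIF dynamics concrete, here is a minimal plaintext sketch (all names and parameter values are illustrative assumptions, not the paper's code). The key point for FHE is that each neuron's update reduces to a linear accumulation followed by a discrete threshold test; in FHE-DiCSNN that threshold step is the part evaluated on ciphertexts via TFHE bootstrapping:

```python
import numpy as np

def lif_layer(inputs, weights, theta=1.0, beta=0.9, T=8):
    """Plaintext LIF dynamics: the membrane potential integrates weighted
    input spikes with leak factor beta, emits a spike when it crosses the
    threshold theta, and resets after firing. Illustrative sketch only;
    parameters theta/beta/T are assumptions, not the paper's settings."""
    n_out = weights.shape[0]
    v = np.zeros(n_out)                      # membrane potentials
    out = np.zeros((T, n_out), dtype=int)    # output spike trains
    for t in range(T):
        v = beta * v + weights @ inputs[t]   # leak + integrate
        spikes = (v >= theta).astype(int)    # threshold -> spike (the
                                             # non-polynomial step done via
                                             # bootstrapping under TFHE)
        v = np.where(spikes == 1, 0.0, v)    # reset fired neurons
        out[t] = spikes
    return out
```

Because the only non-linear operation is this hard threshold, stacking such layers keeps every intermediate value discrete, which is what lets the encrypted evaluation extend to arbitrary depth.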
Problem

Research questions and friction points this paper is trying to address.

Encrypted Computation
Convolutional Spiking Neural Networks
Fully Homomorphic Encryption
Privacy-Preserving Computation
TFHE
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fully Homomorphic Encryption
Spiking Neural Networks
TFHE
Convolutional SNN
Bootstrapping
Longfei Guo
School of Cyber Science and Engineering, Huazhong University of Science and Technology, Wuhan, China
Pengbo Li
School of Mathematics and Statistics, Huazhong University of Science and Technology, Wuhan, China
Ting Gao
Huazhong University of Science and Technology
Stochastic Dynamical Systems Β· Deep Learning Β· Brain Science Β· Quantitative Finance
Yonghai Zhong
School of Mathematics and Statistics, Huazhong University of Science and Technology, Wuhan, China
Haojie Fan
School of Mathematics and Statistics, Huazhong University of Science and Technology, Wuhan, China
Jinqiao Duan
Department of Mathematics and Department of Physics, Great Bay University, Dongguan, China