Alkaid: Resilience to Edit Errors in Provably Secure Steganography via Distance-Constrained Encoding

📅 2026-03-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the lack of robustness in existing provably secure steganographic systems against edit errors—such as insertions, deletions, and substitutions—which stems from their reliance on exact synchronization and absence of error-correction mechanisms. The paper proposes the first scheme that directly embeds edit-distance constraints into provably secure steganographic encoding by ensuring that codewords corresponding to distinct messages maintain a minimum edit distance. Violating codewords are merged to enable deterministic error tolerance, thereby enhancing robustness while preserving information-theoretic security. Coupled with an efficient block-wise batched implementation, the method achieves decoding success rates of 99%–100% across various error channels, an embedding capacity of 0.2 bits per token, and an encoding rate of 6.72 bits per second, significantly outperforming the current state-of-the-art.

📝 Abstract
While provably secure steganography provides strong concealment by ensuring stego carriers are indistinguishable from natural samples, such systems remain vulnerable to real-world edit errors (e.g., insertions, deletions, substitutions) because their decoding depends on perfect synchronization and lacks error-correcting capability. To bridge this gap, we propose Alkaid, a provably secure steganographic scheme resilient to edit errors via distance-constrained encoding. The key innovation integrates the minimum distance decoding principle directly into the encoding process by enforcing a strict lower bound on the edit distance between codewords of different messages. Specifically, if two candidate codewords violate this bound, they are merged to represent the same message, thereby guaranteeing reliable recovery. While maintaining provable security, we theoretically prove that Alkaid offers deterministic robustness against bounded errors. To implement this scheme efficiently, we adopt block-wise and batch processing. Extensive experiments demonstrate that Alkaid achieves decoding success rates of 99% to 100% across diverse error channels, delivers a payload of 0.2 bits per token for high embedding capacity, and maintains an encoding speed of 6.72 bits per second, significantly surpassing state-of-the-art (SOTA) methods in robustness, capacity, and efficiency.
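The core idea described in the abstract — enforcing a minimum edit distance between codewords of distinct messages, merging violating codewords into one class, and recovering messages by minimum-distance decoding — can be sketched as follows. This is a hypothetical illustration only, not the authors' implementation: the function names, the greedy merge strategy, and the toy codewords are assumptions for exposition.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: minimum insertions, deletions, substitutions."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # delete ca
                           cur[j - 1] + 1,               # insert cb
                           prev[j - 1] + (ca != cb)))    # substitute
        prev = cur
    return prev[-1]


def build_codebook(candidates, d_min):
    """Merge candidate codewords that violate the minimum-distance bound.

    Each class is keyed by its first member (the representative); any later
    candidate within edit distance < d_min of a representative is merged
    into that class, so distinct representatives stay >= d_min apart.
    """
    classes = {}  # representative -> merged members
    for w in candidates:
        for rep in classes:
            if edit_distance(w, rep) < d_min:
                classes[rep].append(w)
                break
        else:
            classes[w] = [w]
    return classes


def decode(received, representatives):
    """Minimum-distance decoding: map to the nearest representative."""
    return min(representatives, key=lambda rep: edit_distance(received, rep))
```

The standard coding-theory guarantee applies: if representatives are pairwise at edit distance at least 2t + 1, any corruption by at most t edits still decodes to the correct class — e.g., with representatives `"aaaa"` and `"bbbb"` (distance 4), the one-substitution corruption `"aaba"` decodes back to `"aaaa"`.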
Problem

Research questions and friction points this paper is trying to address.

steganography
edit errors
error resilience
provable security
synchronization
Innovation

Methods, ideas, or system contributions that make the work stand out.

provably secure steganography
edit error resilience
distance-constrained encoding
minimum edit distance
deterministic robustness
Zhihan Cao
Shanghai Jiao Tong University, Shanghai, China
Gaolei Li
Shanghai Jiao Tong University
Cyber Security, Artificial Intelligence Security, Semantic Communication Security
Jun Wu
Shanghai Jiao Tong University, Shanghai, China
Jianhua Li
Shanghai Jiao Tong University, Shanghai, China
Hang Zhang
Cornell University, Ithaca, NY, USA
Mingzhe Chen
Assistant Professor, Electrical and Computer Engineering Department, University of Miami
Machine learning, digital network twins, unmanned aerial vehicles, semantic communications