PDeepPP: A Deep Learning Framework with Pretrained Protein Language for Peptide Classification

📅 2025-02-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
Experimental identification of post-translational modification (PTM) sites and classification of bioactive peptides (BPs) are costly and labor-intensive, while existing computational methods show poor generalizability and limited sequence-modeling capacity. Method: a multi-task deep learning framework that integrates pretrained protein language models (e.g., ESM) with a parallel Transformer-CNN architecture. This is the first work to hybridize multi-head self-attention with 1D convolutional layers for peptide functional prediction, enabling unified modeling of 33 heterogeneous PTM/BP tasks within a single model. Contribution/Results: through large-scale fine-tuning on peptide data and joint multi-task training, the model achieves state-of-the-art performance on 25 of the 33 tasks, significantly outperforming leading tools. It demonstrates high accuracy, strong generalizability across diverse peptide functions, and scalability to new tasks, establishing an efficient computational paradigm for large-scale peptide functional annotation and de novo peptide discovery.
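The summary above describes a parallel Transformer-CNN design: a self-attention branch and a 1D convolutional branch process the same pretrained-embedding sequence, and their pooled features are fused for classification. As a rough illustration only (not the authors' implementation — random weights and a toy window stand in for ESM embeddings and trained parameters), a minimal NumPy sketch of that parallel-branch fusion:

```python
import numpy as np

def self_attention(X):
    # Single-head scaled dot-product self-attention over the sequence
    # (projection weights omitted for brevity).
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X                              # (L, d)

def conv1d(X, W):
    # Valid 1D convolution along the sequence axis; W has shape (k, d_in, d_out).
    k, _, d_out = W.shape
    L = X.shape[0]
    out = np.zeros((L - k + 1, d_out))
    for i in range(L - k + 1):
        out[i] = np.einsum('kd,kdo->o', X[i:i + k], W)
    return out

rng = np.random.default_rng(0)
L, d = 12, 8                                        # toy peptide window and embedding dim
X = rng.standard_normal((L, d))                     # stand-in for pretrained (e.g., ESM) embeddings

attn_feat = self_attention(X).mean(axis=0)          # (d,)  global average pooling
conv_feat = conv1d(X, rng.standard_normal((3, d, d))).max(axis=0)  # (d,)  max pooling

fused = np.concatenate([attn_feat, conv_feat])      # (2d,) parallel-branch fusion
logit = float(fused @ rng.standard_normal(2 * d))   # toy binary classification head
prob = 1.0 / (1.0 + np.exp(-logit))                 # e.g., P(site is modified)
print(fused.shape, round(prob, 3))
```

The key design choice the paper's summary highlights is that attention (long-range context) and convolution (local motifs) run in parallel rather than stacked, so both feature views reach the classifier directly.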

📝 Abstract
Protein post-translational modifications (PTMs) and bioactive peptides (BPs) play critical roles in various biological processes and have significant therapeutic potential. However, identifying PTM sites and bioactive peptides through experimental methods is often labor-intensive, costly, and time-consuming. As a result, computational tools, particularly those based on deep learning, have become effective solutions for predicting PTM sites and peptide bioactivity. Despite progress in this field, existing methods still struggle with the complexity of protein sequences and the challenge of requiring high-quality predictions across diverse datasets. To address these issues, we propose a deep learning framework that integrates pretrained protein language models with a neural network combining transformer and CNN for peptide classification. By leveraging the ability of pretrained models to capture complex relationships within protein sequences, combined with the predictive power of parallel networks, our approach improves feature extraction while enhancing prediction accuracy. This framework was applied to multiple tasks involving PTM site and bioactive peptide prediction, utilizing large-scale datasets to enhance the model's robustness. In the comparison across 33 tasks, the model achieved state-of-the-art (SOTA) performance in 25 of them, surpassing existing methods and demonstrating its versatility across different datasets. Our results suggest that this approach provides a scalable and effective solution for large-scale peptide discovery and PTM analysis, paving the way for more efficient peptide classification and functional annotation.
Problem

Research questions and friction points this paper is trying to address.

Experimental identification of PTM sites and bioactive peptides is costly and slow
Existing computational predictors generalize poorly across diverse datasets
Complex protein sequences are difficult to model accurately
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates pretrained protein language models (e.g., ESM)
Combines transformer and CNN branches in a parallel architecture
Unifies 33 PTM/BP prediction tasks within a single model
Jixiu Zhai
School of Mathematics and Statistics, Lanzhou University, 222 South Tianshui Road, Lanzhou 730000, China
Tianchi Lu
Department of Computer Science, City University of Hong Kong, Kowloon, Hong Kong
Haitian Zhong
Institute of Automation, Chinese Academy of Sciences
Large Language Models · Trustworthy AI · AI for Science
Ziyang Xu
The Chinese University of Hong Kong
AI for Science · Bioinformatics · Medical Image Processing
Yuhuan Liu
Cuiying Honors College, Lanzhou University, 222 South Tianshui Road, Lanzhou 730000, China
Xueying Wang
Department of Computer Science, City University of Hong Kong (Dongguan), Dongguan 523000, China
Dan Huang
Department of Mathematics, Harbin Engineering University, No. 145 Nantong Street, Harbin 150001, China