Browse publications on Google Scholar
Resume (English only)
Academic Achievements
IJCAI 2024: Fast and Continual Knowledge Graph Embedding via Incremental LoRA.
AAAI 2024, Oral: Towards Continual Knowledge Graph Embedding via Incremental Distillation.
AAAI 2023: IterDE: An Iterative Knowledge Distillation Framework for Knowledge Graph Embeddings.
IJCAI 2025: Can Large Models Teach Student Models to Solve Mathematical Problems Like Human Beings? A Reasoning Distillation Method via Multi-LoRA Interaction.
TKDE 2025: Learning Multi-Granularity and Adaptive Representation for Knowledge Graph Reasoning.
ACL 2025, Oral: Acquisition and Application of Novel Knowledge in Large Language Models.
ACL 2025, Oral: LLM-Guided Semantic-Aware Clustering for Topic Modeling.
ACL 2025 Findings: On the Consistency of Commonsense in Large Language Models.
NeurIPS 2024: Unveiling LoRA Intrinsic Ranks via Salience Analysis.
ACL 2024 Findings: Boosting Textural NER with Synthetic Image and Instructive Alignment.
IJCAI 2024: Domain-Hierarchy Adaptation via Chain of Iterative Reasoning for Few-shot Hierarchical Text Classification.
IJCAI 2024: Meta In-Context Learning Makes Large Language Models Better Zero and Few-Shot Relation Extractors.
IJCAI 2024: Recall, Retrieve and Reason: Towards Better In-Context Relation Extraction.
Research Experience
My research focuses on knowledge distillation and model compression for large language models (LLMs), with applications to knowledge graph embedding and reasoning.
Background
Currently a PhD student at the College of Computer Science and Engineering, Southeast University, supervised by Prof. Peng Wang. Research interests include knowledge distillation (KD), large language models (LLMs), and knowledge graphs (KGs). Received a B.Eng. degree from Dalian University of Technology.