Scholar
Xiaotian Ye
Google Scholar ID: F0tij1oAAAAJ
Beijing University of Posts and Telecommunications
Research interests: Natural Language Processing, Knowledge Representation, Large Language Models
Homepage
Google Scholar
Citations & Impact (all-time)
Citations: 53
H-index: 3
i10-index: 2
Publications: 6
Co-authors: 0
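The two impact metrics above have simple definitions: the h-index is the largest h such that at least h papers each have at least h citations, and the i10-index counts papers with at least 10 citations. A minimal sketch of both computations, using hypothetical citation counts for illustration (not this profile's actual per-paper data):

```python
def h_index(citations):
    # h-index: largest h such that at least h papers have >= h citations each
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    # i10-index: number of papers with at least 10 citations
    return sum(1 for cites in citations if cites >= 10)

# Hypothetical per-paper citation counts, for illustration only
counts = [25, 15, 9, 2, 1, 1]
print(h_index(counts))    # 3
print(i10_index(counts))  # 2
```

With these example counts the functions return 3 and 2, matching the shape of the metrics reported above.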
Contact
CV
GitHub
LinkedIn
Publications (7 listed)
Uncovering Context Reliance in Unstructured Knowledge Editing (2026), cited by 0
Spectral Characterization and Mitigation of Sequential Knowledge Editing Collapse (2026), cited by 0
LLM Unlearning Should Be Form-Independent (2025), cited by 0
Disentangling Knowledge Representations for Large Language Model Editing (2025), cited by 0
Open Problems and a Hypothetical Path Forward in LLM Knowledge Paradigms (2025), cited by 0
UIPE: Enhancing LLM Unlearning by Removing Knowledge Related to Forgetting Targets (2025), cited by 0
Uncovering Overfitting in Large Language Model Editing (arXiv.org, 2024), cited by 9
Resume (English only)
Academic Achievements
Paper 'LLM Unlearning Should Be Form-Independent' accepted to IEEE Symposium on Security and Privacy (S&P) 2026
Paper 'Uncovering Overfitting in Large Language Model Editing' accepted as Spotlight at ICLR 2025
Two papers accepted to EMNLP 2025
Paper 'Knowledge Graph Enhanced Large Language Model Editing' accepted to EMNLP 2024 Main Conference
Recipient of CCF Elite Collegiate Award (2025)
Silver medalist at ICPC Asia Regional Contest (2023)
Background
Senior undergraduate student in Computer Science at Beijing University of Posts and Telecommunications (BUPT)
Research intern at NLPR/MAIS, Institute of Automation, Chinese Academy of Sciences (CASIA)
Research focuses on the intersection of knowledge mechanisms and the safety & trustworthiness of foundation models (LLMs/VLMs)
Broadly interested in interpretability, knowledge editing, unlearning, alignment, and agentic safety
Passionate about developing reliable and trustworthy AI systems for real-world applications
Co-authors
0 total (list not available)