- Advances and Challenges in Foundation Agents: From Brain-Inspired Intelligence to Evolutionary, Collaborative, and Safe Systems
- VCR: Pixel-Level Complex Reasoning by Restoring Occluded Text
- FACT: Examining the Effectiveness of Iterative Context Rewriting for Multi-fact Retrieval
- Resonance RoPE: Improving Context Length Generalization of Large Language Models
- AlignVLM: Bridging Vision and Language Latent Spaces for Multimodal Understanding
- BigDocs: An Open Dataset for Training Multimodal Models on Document and Code Tasks
- GraphOmni: A Comprehensive and Extendable Benchmark Framework for Large Language Models on Graphs
Research Experience
Ph.D. research under the supervision of Prof. Bang Liu.
Education
Ph.D.: Computer Science, Mila - Quebec AI Institute / Université de Montréal; B.Eng. (Hons.): Computer Science, Beihang University
Background
Research Interests: Artificial Intelligence, Natural Language Processing, Large Language Models, Long Sequence Modeling, Multimodal Large Models.
Bio: Suyuchen Wang is a Ph.D. candidate at Mila - Quebec AI Institute, focusing on advancing Natural Language Processing and Large Language Models. His research centers on efficient long-sequence modeling techniques and methods for improving LLM performance.
Miscellany
Job Seeking: Seeking Research Engineer or Research Scientist roles focused on Long Sequence LLMs, Multimodal LLMs, and Agentic LLMs; available starting early 2026.