Browse publications on Google Scholar.
Resume (English only)
Academic Achievements
Publications: Multiple papers accepted at top conferences such as NeurIPS, ICCV, ICML, and SIGGRAPH Asia.
Awards: 2025 Google Fellowship in ML and ML Foundations; DARPA Disruptive Idea Paper Award; SIGGRAPH Asia 2024 Best Paper Award; 2025 Stanford Rising Star in Data Science.
Other Achievements: Research work covered by MIT News.
Research Experience
Research Projects:
- Geometric primitives for enhancing language model reasoning: developing theoretically sound neural architectures (e.g., TAPE, GraphKV) and inference algorithms that strengthen the reasoning capabilities of LLMs.
- Analyzing modern neural architectures (e.g., transformers, SSMs) through a geometric lens.
- Investigating weight-space geometries to develop principled operators that directly manipulate neural network weight matrices (e.g., INSP-Net).
- Integrating physical laws of optics and waves to facilitate 3D reconstruction (e.g., SteepGS, AR-Mirror), generation (e.g., GNT, ESD), foundation models (e.g., LSM, SWIFT), and scientific discovery (e.g., CryoFastAR).
Position: PhD student.
Education
Degree: PhD (fifth year); University: Department of Electrical and Computer Engineering, The University of Texas at Austin; Advisor: Prof. Atlas Wang; Collaborators: Prof. Pan Li, Prof. Qiang Liu; Major: Electrical and Computer Engineering.
Background
Research Interests: Machine learning and computer vision.
Professional Field: Enhancing language model reasoning, developing theoretically sound neural architectures, discovering new scaling paradigms, and grounding generative vision models in physics.
Introduction: Advocates 'full-stack' ML development; fascinated by algorithms that integrate mathematical theory, hardware realization, and real-world and scientific applications across modalities.
Miscellany
Note: Currently on the job market for 2026.