Publications: multiple papers accepted at top conferences, including ICML, EMNLP, NAACL, and ACL; Awards: AMIA Best Student Paper Award; Projects: KIVI largely inspired KV cache quantization in Hugging Face Transformers and is integrated into the library; Programs: participated in the Microsoft Accelerating Foundation Models Research program.
Ph.D. in Computer Science, 2022 - 2026 (expected), Rice University, advised by Dr. Xia “Ben” Hu; B.Eng. in Computer Science and Technology, 2017 - 2021, Tsinghua University, with a minor in Statistics.
Background
Research interests: efficient machine learning algorithms and systems (MLSys), using techniques such as quantization, sparsity, and re-parameterization, with attention to robustness and security. Applications span language, vision, time series, graphs, and healthcare.
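To illustrate the quantization direction mentioned above, here is a minimal toy sketch of per-group asymmetric uniform quantization, the core idea behind low-bit KV cache methods such as KIVI. This is an illustrative example under assumed simplifications (plain Python lists, a single group), not the actual KIVI implementation.

```python
# Toy sketch of per-group asymmetric uniform quantization (not the real KIVI code).
def quantize_group(values, nbits=2):
    """Map floats to integers in [0, 2**nbits - 1] with a shared scale and zero point."""
    qmax = (1 << nbits) - 1                      # e.g. 3 for 2-bit
    lo, hi = min(values), max(values)
    scale = (hi - lo) / qmax if hi > lo else 1.0
    q = [round((v - lo) / scale) for v in values]
    return q, scale, lo                          # integers plus dequantization params

def dequantize_group(q, scale, zero):
    """Recover approximate floats from quantized integers."""
    return [qi * scale + zero for qi in q]

x = [0.1, -0.4, 0.8, 0.3]
q, s, z = quantize_group(x, nbits=2)
x_hat = dequantize_group(q, s, z)
err = max(abs(a - b) for a, b in zip(x, x_hat))  # bounded by scale / 2
```

Grouping matters in practice: quantizing along the right axis (e.g. per-channel for keys, per-token for values, as in KIVI) keeps outliers from inflating the scale of an entire tensor.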
Miscellany
Lived in Beijing for 22 years, and in Houston for several years since. Currently seeking full-time research scientist/engineer positions.