Published multiple papers and delivered talks at international conferences, including an invited talk at Nvidia on 'Adaptive Inference in Pretrained LLMs' and a spotlight talk at the Conference on Language Modeling (COLM) on 'Adaptive Layer-skipping in Pre-trained LLMs.' The training dataset of the multimodal language model Open-Qwen2VL has been downloaded more than 90,000 times on HuggingFace.
Research Experience
Professor in the Computer Science Department at the University of California, Santa Barbara, focusing on foundation models in AI, graph mining, graph data management, and transformer-based time series forecasting.
Education
Ph.D. (2006), University of Illinois at Urbana-Champaign
Background
Research Interests: Foundation models in artificial intelligence, knowledge discovery, and cross-disciplinary applications (e.g., finance, healthcare, and science). Has innovated extensively in graph mining, graph data management, and transformer-based time series forecasting. Co-inventor of ADL/Mica, an agent-first approach to conversational AI assistants.
Miscellany
Personal interests: Attended the 6th International Conference on Data-Driven Plasma Science in Santa Fe; reading legendary stories from Los Alamos and articles on complexity science from the Santa Fe Institute during childhood shaped his career interests.