June 2025: Gave a talk on 360Brew and its productionization at the AI Engineer World's Fair.
April 2025: Presented 360Brew at a LinkedIn AI & Data Community event, explaining how it unifies core AI components to streamline ranking and recommendation workflows.
February 2025: Released CoT-ICL Lab, a framework for studying Chain-of-Thought (CoT) and In-Context Learning (ICL), revealing how model depth and the number of in-context examples affect performance, alongside theoretical analysis.
February 2025: Published a technical report on productionizing 360Brew using On-Policy Knowledge Distillation, Model Compression, and Serving Optimizations, achieving a 20x reduction in cost and latency while maintaining quality.
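For intuition, here is a minimal sketch of the generic on-policy distillation recipe (the student samples its own continuations, then matches the teacher's token distributions on them). It is illustrative only, not the 360Brew implementation; the toy model, sizes, and names are all chosen for the example, and real recipes often use reverse KL or JSD variants.

```python
# Minimal sketch of on-policy knowledge distillation (generic technique,
# not the 360Brew implementation). TinyLM stands in for real LMs.
import torch
import torch.nn.functional as F

VOCAB, DIM = 100, 32

class TinyLM(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = torch.nn.Embedding(VOCAB, DIM)
        self.head = torch.nn.Linear(DIM, VOCAB)
    def forward(self, ids):                 # (B, T) -> (B, T, VOCAB)
        return self.head(self.emb(ids))

teacher, student = TinyLM(), TinyLM()
teacher.eval()
opt = torch.optim.AdamW(student.parameters(), lr=1e-4)

prompt = torch.randint(0, VOCAB, (4, 8))   # dummy prompts, shape (B, T)

# 1) Student generates its own continuations (the "on-policy" part).
with torch.no_grad():
    ids = prompt
    for _ in range(8):
        next_tok = torch.distributions.Categorical(
            logits=student(ids)[:, -1]).sample()
        ids = torch.cat([ids, next_tok[:, None]], dim=1)

# 2) Distill: match the teacher's distribution on student-sampled tokens.
#    Here forward KL(teacher || student); variants use reverse KL / JSD.
s_logits = student(ids)[:, :-1]
with torch.no_grad():
    t_logits = teacher(ids)[:, :-1]
loss = F.kl_div(F.log_softmax(s_logits, -1),
                F.log_softmax(t_logits, -1),
                log_target=True, reduction="batchmean")
opt.zero_grad()
loss.backward()
opt.step()
```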
December 2024: Released a technical report demonstrating that the 150B-parameter 360Brew model solves 30+ personalization tasks on LinkedIn without task-specific fine-tuning or complex feature engineering, with strong out-of-domain generalization.
October 2024: Published findings on the 'LLM Lost-in-Distance' phenomenon, showing that performance degrades as the pieces of relevant information sit farther apart in long contexts.
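One hypothetical way to probe this effect: hold total context length fixed, vary only the gap between two facts the model must combine, and track accuracy as a function of that gap. The filler text, fact/question strings, and `ask_llm` below are placeholders for illustration, not the paper's benchmark.

```python
# Hypothetical lost-in-distance probe: keep total context length fixed,
# vary only the distance between the two facts needed to answer.
FILLER = "The weather was unremarkable that afternoon. "
FACT_A = "Alice's badge number is 4721. "
FACT_B = "The red car belongs to Alice. "
QUESTION = "What is the badge number of the red car's owner?"

def build_context(gap: int, total: int = 200) -> str:
    """Embed the two facts `gap` filler sentences apart among `total` sentences."""
    start = (total - gap) // 2          # roughly center the fact pair
    sentences = [FILLER] * total
    sentences[start] = FACT_A
    sentences[start + gap + 1] = FACT_B
    return "".join(sentences)

def ask_llm(prompt: str) -> str:
    return ""                           # stub: swap in a real inference call

for gap in (0, 25, 50, 100, 150):
    answer = ask_llm(build_context(gap) + "\n" + QUESTION)
    print(f"gap={gap:3d}  correct={'4721' in answer}")
```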
August 2024: Open-sourced Liger Kernel, a collection of Triton kernels optimized for memory-efficient and fast LLM training.
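Liger Kernel integrates by monkey-patching Hugging Face model classes. A typical usage pattern, per the project's documentation (check the repo for the current API surface; the checkpoint path below is a placeholder):

```python
# Patch Hugging Face Llama modules (RoPE, RMSNorm, SwiGLU, fused
# cross-entropy) with Liger's Triton kernels, then load the model as usual.
from liger_kernel.transformers import apply_liger_kernel_to_llama
from transformers import AutoModelForCausalLM

apply_liger_kernel_to_llama()   # must run before the model is instantiated

model = AutoModelForCausalLM.from_pretrained("path/to/llama-checkpoint")  # placeholder path
```

The repo also documents `AutoLigerKernelForCausalLM` as a drop-in loader that applies the appropriate patches automatically.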