Publications
ConstraintFlow: A Tensor-Based Compiler and a Runtime for Neuron-Level DNN Certifier Specifications, arXiv, 2025
Syndicate: Synergistic Synthesis of Ranking Function and Invariants for Termination Analysis, arXiv, 2025
Safety and Trust in Artificial Intelligence with Abstract Interpretation, Foundations and Trends in Programming Languages, 2025
Automated Verification of Soundness of DNN Certifiers, OOPSLA, 2025
ConstraintFlow: A DSL for Specification and Verification of Neural Network Analyses, Static Analysis, 2024
Interpreting Robustness Proofs of Deep Neural Networks, ICLR, 2024
Research Experience
Before starting his PhD, he completed research internships with Shraddha Barke and Suman Nath at MSR Redmond, with Rahul Sharma at MSR India, and with Prof. Nate Foster at Cornell University.
Education
Graduated with a Bachelor's and a Master's degree in Computer Science from IIT Delhi in May 2021, where he was advised by Prof. Sanjiva Prasad.
Background
PhD student in the Computer Science Department at the University of Illinois Urbana-Champaign. His current research focuses on making neural networks trustworthy by verifying properties such as robustness using formal methods.
Miscellany
Current projects include ConstraintFlow, a declarative DSL for specifying Abstract Interpretation-based DNN certifiers; Syndicate, a framework that synergistically synthesizes ranking functions and invariants to prove program termination; and the use of large language models to automatically prove theorems.