Sagnik Bhattacharya
Google Scholar ID: xNb5T5IAAAAJ
ML Ph.D. Student, Stanford University
Deep Generative Modeling
Model Compression
Inference Efficiency
Citations & Impact (all-time)
- Citations: 101
- H-index: 6
- i10-index: 4
- Publications: 20
- Co-authors: 22
Contact
No contact links provided.
Publications (10 items)
- On the Fundamental Limits of LLMs at Scale (2025), cited 0 times
- Transformer-Based Sparse CSI Estimation for Non-Stationary Channels (2025), cited 0 times
- minPIC: Towards Optimal Power Allocation in Multi-User Interference Channels (2025), cited 0 times
- AI Enabled 6G for Semantic Metaverse: Prospects, Challenges and Solutions for Future Wireless VR (2025), cited 0 times
- ItDPDM: Information-Theoretic Discrete Poisson Diffusion Model (2025), cited 0 times
- LZMidi: Compression-Based Symbolic Music Generation (2025), cited 0 times
- Retrieval Augmented Generation with Multi-Modal LLM Framework for Wireless Environments (2025), cited 0 times
- An Information-Theoretic Efficient Capacity Region for Multi-User Interference Channel (2025), cited 0 times
Resume (English only)
Co-authors (22 total)
- Muhammad Ahmed Mohsin, Ph.D. @ Stanford University
- Ahsan Bilal, Graduate CS @ University of Oklahoma
- Hassan Rizwan, Ph.D. @ University of California, Riverside
- Muhammad Umer, Stanford University