Zhanpeng Zeng
Google Scholar ID: P9ctuRUAAAAJ
University of Wisconsin Madison
Transformer Efficiency
Citations & Impact (all-time)
Citations: 805
H-index: 8
i10-index: 8
Publications: 16
Co-authors: 0
Contact
No contact links provided.
Publications (6 of 16 listed)
Efficiently Aligning Draft Models via Parameter- and Data-Efficient Adaptation (2026, cited 0)
RQ-GMM: Residual Quantized Gaussian Mixture Model for Multimodal Semantic Discretization in CTR Prediction (2026, cited 0)
Distribution-Aware End-to-End Embedding for Streaming Numerical Features in Click-Through Rate Prediction (2026, cited 0)
Speculative Decoding Reimagined for Multimodal Large Language Models (2025, cited 0)
A Light and Tuning-free Method for Simulating Camera Motion in Video Generation (2025, cited 0)
Semantics Prompting Data-Free Quantization for Low-Bit Vision Transformers (2024, cited 0)
Co-authors: 0 (list not available)