Pooyan Fazli
Google Scholar ID: A35BH40AAAAJ
Arizona State University
Artificial Intelligence
Robotics
Vision and Language
Homepage
Google Scholar
Citations & Impact (all-time)
Citations: 479
h-index: 11
i10-index: 15
Publications: 20
Co-authors: 16
Contact
Email: pooyan@asu.edu
Twitter
LinkedIn
Publications (14 items)
ViDscribe: Multimodal AI for Customizing Audio Description and Question Answering in Online Videos (2026). Citations: 0
CASHEW: Stabilizing Multimodal Reasoning via Iterative Trajectory Aggregation (2026). Citations: 0
EgoVITA: Learning to Plan and Verify for Egocentric Video Reasoning (2025). Citations: 0
FrameOracle: Learning What to See and How Much to See in Videos (2025). Citations: 0
AVATAR: Reinforcement Learning to See, Hear, and Reason Over Video (2025). Citations: 0
DescribePro: Collaborative Audio Description with Human-AI Interaction (2025). Citations: 0
ReGATE: Learning Faster and Better with Fewer Tokens in MLLMs (2025). Citations: 0
VideoPASTA: 7K Preference Pairs That Matter for Video-LLM Alignment (2025). Citations: 0
Academic Achievements
NASA MPLAN Prize on audio-visual question answering in space medicine (July 2025)
Google AI+X research pathways grant featured on ASU News (December 2024)
Google exploreCSR award ($125,000) (November 2024)
HIRBI Grant ($10,000) (October 2024)
PIT-UN Grant ($25,000) (October 2024)
Launched ViDScribe platform for blind and low vision users (April 2024)
NIH R01 grant (~$3.2M) on automated video description for blind and low vision users (July 2023)
ASU CHART grant ($10,012) on superhuman performance in autonomous robot teaming (SPARTA) (June 2023)
NSF grant ($599,974 total) for edge-based robust multi-robot systems (August 2022)
ASEE CyBR-MSI grant ($10,000) (April 2022)
Google TensorFlow grant ($6,500) on responsible AI for social impact (November 2021)
RSCA grant ($17,885) on safe autonomous navigation for service robots (October 2021)
Google exploreCSR award ($32,000) on AI fairness and ethics (August 2021)
Amazon grant ($20,000) on human-aware robot navigation (January 2021)
Ability Central Foundation grant ($99,948) on video accessibility (November 2020)
NSF grant ($999,987) to promote diversity in AI workforce (August 2020)
NSF grant ($749,304) for safe and secure autonomous robots (August 2020)
Ability Central Foundation grant ($102,500) on video accessibility (December 2019)
Center for Computing in Life Sciences grant ($6,000) on human-like navigation (February 2019)
Paper 'Online Learning of Human Navigational Intentions' – Best Paper Award Finalist, ICSR 2018
Paper 'Predicting the Target in Human-Robot Manipulation Tasks' – Best Interactive Paper Award Finalist, ICSR 2018
Work featured in ASU News, ABC 15, and Google TensorFlow blog
Background
Assistant Professor at The GAME School and Media and Immersive eXperience (MIX) Center, Arizona State University (ASU)
Director of the People and Robots Laboratory (PeRL)
Affiliated with the Center for Human, Artificial Intelligence, and Robot Teaming (CHART)
Graduate faculty in Computer Science, Robotics and Autonomous Systems, and Media Arts and Sciences
Research interests: Artificial Intelligence, Autonomous Robots, Multi-Robot Systems, Human-Robot Teaming, Robot Learning, Vision and Language, Multimodal Learning, Video Understanding
Co-authors (16 total)
Hasti Seifi, Arizona State University
Mike D'Arcy, Allen Institute for AI
David Meger, Associate Professor at McGill University
James Little, Professor Emeritus, Computer Science, University of British Columbia