EmbeddingGemma: Powerful and Lightweight Text Representations

📅 2025-09-24
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the challenge of balancing performance, parameter count, and inference efficiency in lightweight text embedding models across multilingual, English, and code domains, this paper introduces an open-source embedding model based on the Gemma-3 architecture. Methodologically, we propose a novel integration of encoder-decoder initialization, geometric embedding distillation, spread-out regularization, and multi-checkpoint fusion—enabling efficient knowledge transfer from large language models and structural optimization of the embedding space within <500M parameters. Experiments demonstrate state-of-the-art (SOTA) results on all three MTEB sub-benchmarks (multilingual, English, and code), outperforming baseline models with over twice the parameter count. The model maintains robustness under quantization and dimensional truncation, significantly improving latency and throughput for edge deployment. It thus achieves an exceptional trade-off between cost-effectiveness and strong generalization across diverse modalities.

Technology Category

Application Category

📝 Abstract
We introduce EmbeddingGemma, a new lightweight, open text embedding model based on the Gemma 3 language model family. Our innovative training recipe strategically captures knowledge from larger models via encoder-decoder initialization and geometric embedding distillation. We improve model robustness and expressiveness with a spread-out regularizer, and ensure generalizability by merging checkpoints from varied, optimized mixtures. Evaluated on the Massive Text Embedding Benchmark (MTEB) across multilingual, English, and code domains, EmbeddingGemma (300M) achieves state-of-the-art results. Notably, it outperforms prior top models, both proprietary and open, with fewer than 500M parameters, and provides performance comparable to models double its size, offering an exceptional performance-to-cost ratio. Remarkably, this lead persists when quantizing model weights or truncating embedding outputs. This makes EmbeddingGemma particularly well-suited for low-latency and high-throughput use cases such as on-device applications. We provide ablation studies exploring our key design choices. We release EmbeddingGemma to the community to promote further research.
Problem

Research questions and friction points this paper is trying to address.

Develops lightweight open text embedding model for efficiency
Improves robustness and expressiveness via novel training techniques
Achieves state-of-the-art performance across multilingual and code domains
Innovation

Methods, ideas, or system contributions that make the work stand out.

Encoder-decoder initialization for knowledge transfer
Geometric embedding distillation from larger models
Spread-out regularizer for improved robustness
🔎 Similar Papers
No similar papers found.
Henrique Schechter Vera
Henrique Schechter Vera
Machine Learning Engineer, Google
natural language understandingrepresentation learningmachine learningartificial intelligence
Sahil Dua
Sahil Dua
Google DeepMind
Large Language ModelsNatural Language ProcessingRepresentation LearningEmbeddings
B
Biao Zhang
EmbeddingGemma Team, Google
D
Daniel M. Salz
EmbeddingGemma Team, Google
R
Ryan Mullins
EmbeddingGemma Team, Google
S
S. Panyam
EmbeddingGemma Team, Google
S
Sara Smoot
EmbeddingGemma Team, Google
Iftekhar Naim
Iftekhar Naim
Google
Machine LearningData MiningNatural Language ProcessingBioinformatics
J
Joe Zou
EmbeddingGemma Team, Google
F
Feiyang Chen
EmbeddingGemma Team, Google
Daniel Cer
Daniel Cer
Research Scientist, Google DeepMind
Natural Language Processing (NLP)Natural Language Understanding (NLU)Deep LearningMachine
A
Alice Lisak
EmbeddingGemma Team, Google
M
Min Choi
EmbeddingGemma Team, Google
L
Lucas Gonzalez
EmbeddingGemma Team, Google
O
Omar Sanseviero
EmbeddingGemma Team, Google
G
Glenn Cameron
EmbeddingGemma Team, Google
I
Ian Ballantyne
EmbeddingGemma Team, Google
K
Kat Black
EmbeddingGemma Team, Google
K
Kaifeng Chen
EmbeddingGemma Team, Google
W
Weiyi Wang
EmbeddingGemma Team, Google
Z
Zhe Li
EmbeddingGemma Team, Google
G
Gus Martins
EmbeddingGemma Team, Google
Jinhyuk Lee
Jinhyuk Lee
Google DeepMind
Natural Language ProcessingMachine Learning
M
Mark Sherwood
EmbeddingGemma Team, Google
J
Juyeong Ji
EmbeddingGemma Team, Google
Renjie Wu
Renjie Wu
University of California, Riverside
Time Series AnalysisData MiningMachine Learning
Jingxiao Zheng
Jingxiao Zheng
EmbeddingGemma Team, Google
J
Jyotinder Singh
EmbeddingGemma Team, Google
Abheesht Sharma
Abheesht Sharma
Machine Learning, Keras, Google
Computational LinguisticsAd Fraud DetectionComputer VisionDeep LearningMachine Learning
D
Divya Sreepat
EmbeddingGemma Team, Google
A
Aashi Jain
EmbeddingGemma Team, Google
A
Adham Elarabawy
EmbeddingGemma Team, Google
A
AJ Co
EmbeddingGemma Team, Google
Andreas Doumanoglou
Andreas Doumanoglou
Google
AI
B
Babak Samari
EmbeddingGemma Team, Google
B
Ben Hora
EmbeddingGemma Team, Google
B
B. Potetz
EmbeddingGemma Team, Google
Dahun Kim
Dahun Kim
Research Scientist, Google DeepMind
Enrique Alfonseca
Enrique Alfonseca
Research Manager, Google Inc. (US)
Natural Language Processing
Fedor Moiseev
Fedor Moiseev
Google
deep learningnatural language processing
F
Feng Han
EmbeddingGemma Team, Google
F
Frank Palma Gomez
EmbeddingGemma Team, Google
G
Gustavo Hernández Abrego
EmbeddingGemma Team, Google
H
Hesen Zhang
EmbeddingGemma Team, Google
H
Hui Hui
EmbeddingGemma Team, Google
J
Jay Han
EmbeddingGemma Team, Google
K
Karan Gill
EmbeddingGemma Team, Google
K
Ke Chen
EmbeddingGemma Team, Google
K
Koert Chen
EmbeddingGemma Team, Google
Madhuri Shanbhogue
Madhuri Shanbhogue
Masters student at Georgia Institute of Technology
Artificial IntelligenceComputer VisionAdversarial Machine Learning
Michael Boratko
Michael Boratko
Research Scientist, Google
Machine LearningArtificial IntelligenceOptimizationGeometric Embeddings
P
P. Suganthan
EmbeddingGemma Team, Google
Sai Meher Karthik Duddu
Sai Meher Karthik Duddu
Google Research
Machine Learning
S
Sandeep Mariserla
EmbeddingGemma Team, Google
S
Setareh Ariafar
EmbeddingGemma Team, Google
S
Shanfeng Zhang
EmbeddingGemma Team, Google
S
Shijie Zhang
EmbeddingGemma Team, Google
S
Simon Baumgartner
EmbeddingGemma Team, Google
S
Sonam Goenka
EmbeddingGemma Team, Google
S
Steve Qiu
EmbeddingGemma Team, Google
T
T. Dabral
EmbeddingGemma Team, Google
T
Trevor Walker
EmbeddingGemma Team, Google
V
Vikram Rao
EmbeddingGemma Team, Google
W
Waleed Khawaja
EmbeddingGemma Team, Google
W
Wenlei Zhou
EmbeddingGemma Team, Google
Xiaoqi Ren
Xiaoqi Ren
Google
LLM
Y
Ye Xia
EmbeddingGemma Team, Google
Y
Yichang Chen
EmbeddingGemma Team, Google
Y
Yi-Ting Chen
EmbeddingGemma Team, Google
Zhe Dong
Zhe Dong
Microsoft AI
Z
Zhongli Ding
EmbeddingGemma Team, Google
Francesco Visin
Francesco Visin
Senior Research Scientist at Google DeepMind
Model based Reinforcement Learning
G
Gael Liu
EmbeddingGemma Team, Google
J
Jiageng Zhang
EmbeddingGemma Team, Google
K
Kathleen Kenealy
EmbeddingGemma Team, Google
M
Michelle Casbon
EmbeddingGemma Team, Google
Ravin Kumar
Ravin Kumar
Bachelor of Technology in Computer Science and Engineering
artificial intelligencedeep learningalgorithm designeconomicsmathematics
Thomas Mesnard
Thomas Mesnard
Research Scientist at Google DeepMind
LLMReinforcement LearningArtificial Intelligence
Z
Zach Gleicher
EmbeddingGemma Team, Google
C
C. Brick
EmbeddingGemma Team, Google
Olivier Lacombe
Olivier Lacombe
EmbeddingGemma Team, Google
Adam Roberts
Adam Roberts
Google DeepMind
Machine LearningMusic GenerationComputer ScienceComputational Biology
Y
Yunhsuan Sung
EmbeddingGemma Team, Google
R
Raphael Hoffmann
EmbeddingGemma Team, Google
T
Tris Warkentin
EmbeddingGemma Team, Google
Armand Joulin
Armand Joulin
Google DeepMind
Machine Learning
T
Tom Duerig
EmbeddingGemma Team, Google
Mojtaba Seyedhosseini
Mojtaba Seyedhosseini
Google
Machine LearningComputer VisionMultimodal Models