Pretraining Finnish ModernBERTs

📅 2025-11-12
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
To address the limited long-context modeling capability of Finnish under multilingual resource constraints, this paper pretrains six ModernBERT encoder models ranging from 51M to 475M parameters. Methodologically, the authors perform self-supervised pretraining on corpora emphasizing Finnish and other languages relevant to Finland, with explicit support for sequences longer than 512 tokens, and present empirical results on using different data in the final stage of training. The contributions are threefold: (1) a family of Finnish-focused ModernBERT encoders in six sizes trained under limited multilingualism; (2) performance that is competitive with or superior to existing multilingual models, and superior to monolingual models on tasks requiring context beyond 512 tokens, including question answering and document classification; and (3) public release of all models and training code to support research and applications in long-context Finnish NLP.

๐Ÿ“ Abstract
This paper reports on pretraining ModernBERT encoder models in six different sizes, ranging from 51M to 475M parameters, with a focus on limited multilingualism, emphasizing languages relevant to Finland. Our models are competitive with, or superior to, existing multilingual models. They outperform monolingual models on tasks that require a context longer than 512 tokens. We present empirical results on using different data in the final stage of training. The code and models are publicly released.
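The self-supervised objective behind BERT-style encoders such as these is masked language modeling: a fraction of input tokens is hidden and the model is trained to reconstruct them. A minimal sketch of the standard BERT masking recipe (the 15% rate and 80/10/10 split follow the original BERT procedure; whether this paper alters them is not stated here):

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", vocab=None, mask_prob=0.15, seed=0):
    """BERT-style masking: pick ~mask_prob of positions; of those,
    80% become [MASK], 10% a random vocabulary token, 10% stay unchanged.
    The model is trained to predict the original token at each picked position."""
    rng = random.Random(seed)
    vocab = vocab or tokens          # fallback vocabulary for the random-replace case
    masked = list(tokens)
    labels = [None] * len(tokens)    # None = position not used in the loss
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # remember the target for the loss
            r = rng.random()
            if r < 0.8:
                masked[i] = mask_token
            elif r < 0.9:
                masked[i] = rng.choice(vocab)
            # else: keep the original token unchanged
    return masked, labels
```

Dynamic masking of this kind is typically re-applied each epoch so the model sees different corrupted views of the same sentence.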
Problem

Research questions and friction points this paper is trying to address.

Pretraining Finnish-focused multilingual BERT models
Developing competitive models with limited multilingual data
Enhancing performance on long-context language tasks
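Long-context support in ModernBERT-style encoders comes in part from rotary position embeddings (RoPE), which encode position by rotating pairs of query/key dimensions so that attention scores depend only on the relative offset between tokens, allowing context lengths well beyond 512. A minimal pure-Python sketch of the rotation (illustrative, not this paper's implementation):

```python
import math

def rope(x, pos, base=10000.0):
    """Apply a rotary position embedding to a vector x of even length d:
    each pair (x[2i], x[2i+1]) is rotated by angle pos * base**(-2i/d)."""
    d = len(x)
    out = [0.0] * d
    for i in range(d // 2):
        theta = pos * base ** (-2.0 * i / d)
        c, s = math.cos(theta), math.sin(theta)
        x1, x2 = x[2 * i], x[2 * i + 1]
        out[2 * i] = x1 * c - x2 * s       # standard 2-D rotation
        out[2 * i + 1] = x1 * s + x2 * c
    return out
```

Because each rotation is orthogonal, norms are preserved, and the dot product between a rotated query at position m and a rotated key at position n depends only on n - m, which is what lets the same weights generalize across absolute positions.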
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pretrained Finnish ModernBERT encoder models
Models range from 51M to 475M parameters
Focus on limited multilingualism for Finland
Authors

Akseli Reunamo
University of Turku, Department of Computing
Laura-Maria Peltonen
University of Eastern Finland and Kuopio University Hospital
Hans Moen
Aalto University, Department of Computer Science
Sampo Pyysalo
University of Turku