By Daniel Dominguez
Publication Date: 2026-01-28 10:16:00
Google has released TranslateGemma, a new suite of open translation models built on the Gemma 3 architecture. The release includes three model sizes — 4B, 12B, and 27B parameters — and targets machine translation across 55 languages. The models are designed to run in a range of environments, from mobile and edge devices to consumer hardware and cloud accelerators, and are available as open models for developers and researchers.
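Since the checkpoints are released as open models, developers can load them like other open-weight translation models. The snippet below is a minimal sketch; the prompt template and the helper shown here are assumptions for illustration, not a confirmed TranslateGemma interface or model identifier.

```python
# Hypothetical sketch: composing a translation instruction for an
# instruction-tuned open model. The template is an assumption, not a
# documented TranslateGemma prompt format.
def build_translation_prompt(text: str, src: str, tgt: str) -> str:
    """Compose a simple source-to-target translation instruction."""
    return f"Translate the following text from {src} to {tgt}:\n{text}"

prompt = build_translation_prompt(
    "The weather is nice today.", "English", "German"
)
# The resulting string can then be passed to a locally loaded checkpoint
# (for example via an inference library of your choice).
print(prompt)
```

In practice the prompt would be fed to one of the released checkpoints running locally or on a cloud accelerator, depending on the chosen model size.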
TranslateGemma is the result of a training process focused on efficiency and on transferring knowledge from larger proprietary systems. Google used a two-stage approach that combines supervised fine-tuning with reinforcement learning. In the supervised phase, the base Gemma 3 models were trained on parallel datasets composed of both human-produced translations and synthetic translations generated by Gemini models. This mix was intended to increase coverage across language families, including low-resource languages, while maintaining consistency in…

