Google releases TranslateGemma open models supporting 55 languages
Google has launched TranslateGemma, a collection of open translation models built on Gemma 3, aimed at delivering high-quality multilingual performance across a range of devices and deployment environments.
Google said TranslateGemma supports 55 languages and is available in 4B, 12B, and 27B parameter sizes, offering developers and researchers more efficient options for machine translation without sacrificing quality.
The company said the models were created by distilling knowledge from its most advanced large-scale models into smaller, open models, improving efficiency while maintaining strong translation performance.
In technical evaluations, Google said the 12B TranslateGemma model outperformed the larger Gemma 3 27B baseline on the MetricX quality metric over the WMT24++ benchmark. The 4B model was also reported to rival the 12B baseline's performance despite its smaller size.
Google said training followed a two-stage process, beginning with supervised fine-tuning on parallel datasets that included human translations and high-quality synthetic translations generated by Gemini models. This was followed by a reinforcement learning phase guided by an ensemble of reward models, including MetricX-QE and AutoMQM.
The company added that TranslateGemma retains Gemma 3’s multimodal capabilities and demonstrated improved performance on the Vistra image translation benchmark, despite not undergoing dedicated multimodal fine-tuning.
Google said the models are designed for a wide range of deployment scenarios. The 4B model is optimised for mobile and edge environments, the 12B model for consumer laptops, and the 27B model for cloud deployment on a single H100 GPU or TPU.
Details are available in the technical report, and researchers can access TranslateGemma through platforms including Kaggle, Hugging Face, the Gemma Cookbook and Vertex AI, the company said.
The Recap
- Open translation models built on Gemma 3 released.
- Three sizes available: 4B, 12B, and 27B parameters.
- Technical evaluations show the 12B model outperforming the Gemma 3 27B baseline.