Rakuten Opens 700B-Parameter Japanese AI Model for Free

March 19, 2026


100-word summary

Rakuten just released a 700-billion-parameter AI model built specifically for Japanese, free under the Apache 2.0 license on Hugging Face. The Mixture-of-Experts design activates only 40 billion parameters per token, making it surprisingly efficient for its size while outscoring several leading models on Japanese benchmarks (Japanese MT-Bench: 8.88). The model handles writing, code generation, and document extraction. Rakuten's bet: most frontier models treat Japanese as an afterthought, leaving room for a locally trained alternative. The release is part of Japan's government-backed GENIAC initiative to build regional AI capacity. Translation: you no longer need to fine-tune an English-first model if your users speak Japanese.
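The "700B total, 40B active" figure is what Mixture-of-Experts routing buys you: a router scores many expert sub-networks per token but only runs the top few. Rakuten hasn't published the exact router here, so the sketch below is purely illustrative (toy dimensions, top-2 routing, made-up weights), not the model's actual architecture:

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Route one token through only the top_k highest-scoring experts.

    x:       (d,) token hidden state
    gate_w:  (d, n_experts) router weights
    experts: list of (d, d) expert weight matrices
    """
    logits = x @ gate_w                    # router score for every expert
    top = np.argsort(logits)[-top_k:]      # keep only the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the chosen experts
    # Only the selected experts run; the other experts stay idle for
    # this token, which is why active parameters << total parameters.
    y = sum(w * (x @ experts[i]) for w, i in zip(weights, top))
    return y, top

rng = np.random.default_rng(0)
d, n_experts = 8, 16
x = rng.normal(size=d)
gate_w = rng.normal(size=(d, n_experts))
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]

y, active = moe_forward(x, gate_w, experts, top_k=2)
print(len(active), "of", n_experts, "experts touched this token")
```

Scaled up, the same idea is how a 700B-parameter model can cost roughly what a 40B dense model costs per token at inference: total capacity grows with the expert count, while per-token compute is fixed by `top_k`.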

