Cohere Drops Tiny Aya: Offline AI for 70+ Languages

February 17, 2026

Published: February 17, 2026 at 11:36 PM


What happened

At India's AI Summit, Cohere unveiled Tiny Aya, a family of open-weight AI models with 3.35 billion parameters covering 70+ languages. The twist? The models run entirely offline on your device, no internet connection needed. Think real-time translation for Bengali, Hindi, Tamil, and dozens more languages without data costs or connectivity issues. Regional variants target specific linguistic needs: Tiny Aya Earth covers African and West Asian languages, while Tiny Aya Fire covers South Asian ones. Trained on just 64 Nvidia H100 GPUs, the models are available on Hugging Face for researchers and developers.

Why it matters

This could democratize AI access in regions with limited connectivity, making powerful language tech genuinely usable where internet infrastructure lags behind demand.
