Korean Telecom KT Built 32B Model for Four Languages


Published: February 27, 2026 at 12:35 AM



What happened

KT announced it will unveil Mi:dm K 2.5 Pro, a 32-billion-parameter model trained on Korean, English, Japanese, and Chinese, at Mobile World Congress next week. The pitch: businesses can analyze hundreds of pages across those four languages in one pass, thanks to a 128,000-token context window (roughly the length of a short novel). KT claims the model scored 87% on an agent benchmark, meaning it can call tools and complete multi-step tasks, not just answer questions.
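The "hundreds of pages" framing can be sanity-checked with back-of-envelope arithmetic. The figures below are assumptions, not KT's numbers: roughly 300 words per printed page and about 1.3 tokens per English word (Korean, Japanese, and Chinese text typically tokenize into more tokens per word, so the real page count for mixed-language documents would be lower).

```python
# Back-of-envelope check on the "hundreds of pages" claim.
# Assumed figures (not from KT's announcement):
CONTEXT_TOKENS = 128_000
WORDS_PER_PAGE = 300    # assumption: a typical printed page
TOKENS_PER_WORD = 1.3   # assumption: rough average for English text

tokens_per_page = WORDS_PER_PAGE * TOKENS_PER_WORD  # ~390 tokens/page
pages = CONTEXT_TOKENS / tokens_per_page

print(f"~{pages:.0f} pages fit in the context window")  # ~328 pages
```

Under those assumptions the window holds on the order of 300 English-language pages, which is consistent with both "hundreds of pages" and "a short novel."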

Why it matters

The telecom giant is betting regional companies want AI that speaks their languages natively, rather than relying on models trained mostly on English and clumsily adapted afterward.

Sources