Krux

China's Zhipu AI Drops GLM-5: 744B Parameters Challenge OpenAI
Published: February 16, 2026 at 4:47 AM
What happened
Zhipu AI just unleashed GLM-5, a massive open-source language model with 744 billion parameters trained on 28.5 trillion tokens. The flagship model doubles its predecessor's size and runs entirely on Huawei chips—no US hardware needed. Using DeepSeek's Mixture-of-Experts architecture, GLM-5 claims industry-leading coding chops, allegedly outperforming Google's Gemini 3 Pro and nearing Anthropic's Claude Opus 4.5. Markets loved it: Zhipu's Hong Kong stock surged 28.7% to HK$402.
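The Mixture-of-Experts design mentioned above is what lets a 744-billion-parameter model stay cheap to run: each token is routed to only a handful of "expert" sub-networks, so only a small fraction of the parameters are active per step. Here is a minimal sketch of the general top-k routing technique; the expert count, k, and dimensions are illustrative, not GLM-5's or DeepSeek's actual configuration.

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, router_weights, k=2):
    """Route input x to the top-k experts and mix their outputs.

    x              : a token's hidden state (list of floats)
    experts        : list of callables, each mapping x -> list of floats
    router_weights : one weight vector per expert (router logit = w . x)
    """
    # Router scores: one logit per expert.
    logits = [sum(w_i * x_i for w_i, x_i in zip(w, x)) for w in router_weights]
    # Keep only the k highest-scoring experts.
    top = sorted(range(len(experts)), key=lambda i: logits[i], reverse=True)[:k]
    # Renormalise the gates over just the chosen experts.
    gates = softmax([logits[i] for i in top])
    # Weighted sum of the selected experts' outputs. This sparsity is why
    # a huge-parameter MoE activates far fewer parameters per token than
    # a dense model of the same size.
    out = [0.0] * len(x)
    for g, i in zip(gates, top):
        y = experts[i](x)
        out = [o + g * y_j for o, y_j in zip(out, y)]
    return out, top
```

In production MoE models the router and experts are learned layers inside each transformer block, with extra load-balancing losses to keep experts evenly used; this sketch only shows the routing arithmetic.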
Why it matters
As China floods the zone with homegrown models post-IPO, GLM-5 signals Beijing's serious play for AI independence and global competitiveness.