China's Zhipu AI Drops 744B-Parameter GLM-5, Trained Entirely on Huawei Chips

February 15, 2026

Published: February 15, 2026 at 8:12 AM


What happened

Zhipu AI just unleashed GLM-5, a massive 744-billion-parameter open-source model trained exclusively on domestic Huawei Ascend chips, with no US hardware involved. Released in mid-February, it packs 40 billion active parameters (pointing to a mixture-of-experts design), a 200K-token context window, and claimed top scores on coding and agentic benchmarks, even outperforming Gemini 3 Pro in Zhipu's internal testing. The weights are available under an MIT license on Hugging Face, with API access starting at $0.80 per million tokens. The model also features an autonomous "Agent Mode" for web browsing and office-file generation. Fresh off a $7.3B Hong Kong IPO, Zhipu is flexing China's AI hardware independence while undercutting Western pricing.
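To put the quoted pricing in perspective, here is a minimal sketch of what a request and its cost might look like. The $0.80-per-million-token rate and the 200K-token context window come from the article; the model identifier and the OpenAI-style request shape are assumptions for illustration, not Zhipu's documented API schema.

```python
# Sketch: estimating GLM-5 API cost at the article's quoted rate.
# The model name and request schema below are assumptions for illustration;
# Zhipu's actual API details may differ.

PRICE_PER_MILLION_TOKENS = 0.80  # USD, starting rate cited in the article

def estimate_cost(num_tokens: int) -> float:
    """Return the USD cost of processing num_tokens at the quoted rate."""
    return num_tokens / 1_000_000 * PRICE_PER_MILLION_TOKENS

# A hypothetical OpenAI-style chat request body (not an official schema):
request_body = {
    "model": "glm-5",  # assumed model identifier
    "messages": [{"role": "user", "content": "Summarize this repository."}],
    "max_tokens": 1024,
}

# Filling the full 200K-token context window would cost roughly 16 cents:
print(f"${estimate_cost(200_000):.2f}")  # prints "$0.16"
```

Even a maximal 200K-token prompt costs well under a dollar at this rate, which illustrates how aggressively this pricing undercuts frontier Western models.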

Why it matters

This signals Beijing's push toward self-sufficient AI infrastructure, potentially reshaping global competition as Chinese models scale without reliance on restricted chips.

Sources