China's Z.ai Drops 744B-Parameter GLM-5 Model, Ditches US Chips

Published: February 15, 2026 at 8:12 AM

What happened

Z.ai has released GLM-5, a 744-billion-parameter open-source LLM trained entirely on Huawei chips, with no US GPUs involved. A mixture-of-experts design spreads the model across 256 experts but activates only 40 billion parameters per inference, and the model packs a 200,000-token context window while posting strong benchmark results, including 77.8% on SWE-bench Verified. Released under an MIT-style license with weights on HuggingFace, GLM-5 doubles GLM-4.7's size and adds DeepSeek Sparse Attention for agentic workflows. Fresh off its Hong Kong IPO, which raised $558M, Z.ai raised coding subscription prices 30% in response to surging demand.
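The parameter figures above can be made concrete with some quick arithmetic. This is a rough sketch: only the total and active parameter counts and the expert count come from the article; the routing breakdown of GLM-5 itself is not described here, and the memory figure is just standard 16-bit sizing.

```python
# Back-of-the-envelope sketch of sparse mixture-of-experts (MoE) activation,
# using only the figures reported in the article. GLM-5's actual routing
# details are not public here; this shows what the headline numbers imply.

TOTAL_PARAMS = 744e9   # 744B total parameters
ACTIVE_PARAMS = 40e9   # 40B parameters activated per inference
NUM_EXPERTS = 256      # experts per MoE layer; each token is routed to a subset

active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
fp16_total_tb = TOTAL_PARAMS * 2 / 1e12   # 2 bytes per parameter at FP16/BF16

print(f"Active per forward pass: {active_fraction:.1%} of all weights")  # ≈ 5.4%
print(f"Full 16-bit weight footprint: {fp16_total_tb:.2f} TB")           # ≈ 1.49 TB
```

The takeaway is that sparse activation decouples capacity from per-token compute: inference cost scales with the roughly 5% of weights that fire per pass, while storage and serving infrastructure must still hold the full 744B.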

Why it matters

GLM-5's Huawei-only training marks a pivotal step toward China's AI compute independence, potentially reshaping the global frontier model race as domestic hardware proves viable at scale.

Sources