Zhipu Unveils GLM-5 Model With 744B Parameters, Intensifying Chinese AI Race With DeepSeek

Reviewed by Nidhi Govil


Chinese AI startup Zhipu released GLM-5, a new flagship model with 744 billion parameters that rivals Anthropic's Claude Opus in coding benchmarks. Trained entirely on Huawei Ascend chips, the open-source model achieved a record-low hallucination rate and sent Zhipu shares surging 34%. The release intensifies competition among Chinese AI companies ahead of DeepSeek's anticipated next-generation model launch.

Zhipu Releases GLM-5 as Chinese AI Companies Accelerate Innovation

Zhipu has officially launched GLM-5, its latest large language model designed to tackle complex coding and agent tasks, marking a significant escalation in the Chinese AI race. The Beijing-based company, which went public on the Hong Kong Stock Exchange last month, released the new flagship model on February 11, joining domestic rivals in unveiling sophisticated AI models ahead of the Lunar New Year festival.[1][2] The timing reflects heightened anticipation around DeepSeek's expected next-generation architecture release during the holiday, which industry observers believe will set a new benchmark for Chinese open-source models.[1]

Source: ET


Technical Achievements and Competitive Positioning

GLM-5 represents a massive leap in scale and capability, featuring approximately 744 billion total parameters, with 40-44 billion active per token in its Mixture-of-Experts architecture, more than double the 355 billion parameters of its predecessor.[3] The model was trained on 28.5 trillion tokens using domestically manufactured hardware, with inference running on Huawei Ascend chips as well as products from Moore Threads, Cambricon, and Kunlunxin.[2][4] This full independence from US-manufactured hardware positions GLM-5 as a milestone in China's drive toward chip self-sufficiency.[4] The model also achieved a record-low hallucination rate on the independent Artificial Analysis Intelligence Index v4.0, scoring -1 on the AA-Omniscience Index, a 35-point improvement over its predecessor that puts it ahead of the entire AI industry, including Google, OpenAI, and Anthropic, in knowledge reliability.[3]

Performance Benchmarks Against Global Rivals

In coding benchmark tests, GLM-5 approaches Anthropic's Claude Opus 4.5 performance levels and surpasses Google's Gemini 3 Pro on some benchmarks, according to company statements.[2] On SWE-bench Verified, GLM-5 scored 77.8, outperforming Gemini 3 Pro at 76.2 and approaching Claude Opus 4.6 at 80.9.[3] The model also ranked first among open-source models on Vending Bench 2, a business simulation test, finishing with a final balance of $4,432.12.[3] According to Artificial Analysis, GLM-5 is now the most powerful open-source model globally, surpassing Moonshot AI's Kimi K2.5, released just two weeks earlier.[3]

Novel Training Infrastructure and Agentic Engineering

To address training inefficiencies at this scale, Zhipu developed "slime," a novel asynchronous reinforcement learning infrastructure that breaks traditional lockstep bottlenecks.[3] By integrating system-level optimizations such as Active Partial Rollouts, slime tackles the generation bottlenecks that typically consume over 90% of reinforcement learning training time, significantly accelerating iteration cycles for complex agentic engineering tasks.[3]

Source: VentureBeat


The model features native Agent Mode capabilities that transform raw prompts into professional office documents, including ready-to-use .docx, .pdf, and .xlsx files. This enables it to autonomously generate detailed financial reports, sponsorship proposals, and complex spreadsheets that integrate directly into enterprise workflows.[3]

Market Impact and Competitive Pricing Strategy

The Chinese AI startup's share price surged as much as 34% following GLM-5's launch, while competitor MiniMax saw an 11% rise after releasing its updated M2.5 model the same week.[4] Markets have shown elevated sensitivity to new artificial intelligence releases this month, with Zhipu itself jumping more than 50% this week after JPMorgan initiated coverage.[1] Following the launch, Zhipu raised the price of its GLM Coding Plan by 30%, positioning it as comparable to Anthropic's Claude Code, which remains unavailable in China.[4]

Source: Silicon Republic


Despite the increase, GLM-5 is aggressively priced at approximately $0.80 per million input tokens and $2.56 per million output tokens on OpenRouter, making it roughly 6 times cheaper on input and nearly 10 times cheaper on output than Claude Opus 4.6, which runs $5 per million input tokens and $25 per million output tokens.[3]
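As a quick sanity check on the quoted multiples, the per-million-token rates cited above can be compared directly; this is a minimal sketch using only the prices reported in this article:

```python
# Per-million-token prices cited above (USD)
glm5_input, glm5_output = 0.80, 2.56    # GLM-5 on OpenRouter
opus_input, opus_output = 5.00, 25.00   # Claude Opus 4.6

input_ratio = opus_input / glm5_input     # how much cheaper on input
output_ratio = opus_output / glm5_output  # how much cheaper on output

print(f"Input:  {input_ratio:.2f}x cheaper")   # 6.25x
print(f"Output: {output_ratio:.2f}x cheaper")  # ~9.77x
```

The ratios of 6.25x on input and about 9.77x on output match the article's "roughly 6 times" and "nearly 10 times" figures.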

Strategic Context in China's AI Boom

Valued at roughly $18 billion, Zhipu is the first major global large language model builder to list on a stock market and is considered one of China's "AI tigers", a group of promising AI startups vying with the United States to lead frontier technology development.[1][2] Founded in 2019 by Tsinghua University researchers with backing from government funds and early investment from Alibaba Group and Tencent Holdings, Zhipu is now transitioning from building customized AI solutions for business clients in China to offering its technology to global users.[1] In an internal memo after the company's IPO, co-founder and chief AI scientist Tang Jie stated that "DeepSeek was a wake-up call," emphasizing the need to return to fundamental research.[1] The release follows a series of updates, including version 4.7 last month and version 4.6 in September, demonstrating rapid iteration as Chinese tech companies release a flurry of new models to capitalize on the AI boom and compete with U.S. rivals.[2]
