Baidu has formally unveiled ERNIE 5.1, a new language model positioned as a step forward in both performance and cost efficiency. The company says training ERNIE 5.1 cost about 94% less than training AI systems of similar scale, framing the release as an answer to the resource demands of next-generation AI.
Baidu attributes the cost reduction to how ERNIE 5.1 was built. Rather than training a new model from scratch, the company derived it from its predecessor, ERNIE 5.0, which launched in January 2026. Instead of running separate training cycles for different model sizes, Baidu used an elastic training framework called “Once-For-All.”
Under this approach, ERNIE 5.1 is carved out of a larger “mother” model by extracting an optimized sub-network. The ERNIE 5.0 architecture is described as having about 2.4 trillion parameters. ERNIE 5.1 is configured as a leaner model whose total parameter count is roughly one-third of the original, while its active parameters (those actually used to generate each response) are halved. Baidu says inheriting knowledge from the larger model, rather than repeating the most expensive phase of training, is what enables savings of up to 94% of the budget.
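Baidu has not published the internals of Once-For-All, so the following is only a minimal sketch of the general idea under illustrative assumptions: a child configuration keeps roughly a third of the parent’s experts and half of the per-token active experts, and the retained experts inherit the parent’s weights instead of being re-initialized. The class names, the utilization-based selection rule, and the toy sizes are not Baidu’s; only the one-third and one-half ratios come from the article.

```python
import random
from dataclasses import dataclass

@dataclass
class MoEConfig:
    num_layers: int
    experts_per_layer: int   # total experts in each MoE block
    active_experts: int      # experts routed per token
    hidden_dim: int

def derive_subnetwork(parent: MoEConfig,
                      total_ratio: float = 1 / 3,
                      active_ratio: float = 1 / 2) -> MoEConfig:
    """Configure a child model whose expert count is roughly `total_ratio`
    of the parent's and whose per-token active experts are cut by
    `active_ratio`, mirroring the proportions described for ERNIE 5.1."""
    return MoEConfig(
        num_layers=parent.num_layers,
        experts_per_layer=max(1, round(parent.experts_per_layer * total_ratio)),
        active_experts=max(1, round(parent.active_experts * active_ratio)),
        hidden_dim=parent.hidden_dim,
    )

def inherit_weights(parent_experts: dict, child_cfg: MoEConfig) -> dict:
    """Keep the most-utilized parent experts so the child starts from
    inherited knowledge instead of random initialization (an illustrative
    selection rule, not Baidu's)."""
    # parent_experts maps expert_id -> (utilization_score, weights)
    ranked = sorted(parent_experts.items(), key=lambda kv: kv[1][0], reverse=True)
    keep = ranked[:child_cfg.experts_per_layer]
    return {expert_id: weights for expert_id, (_, weights) in keep}

if __name__ == "__main__":
    parent = MoEConfig(num_layers=64, experts_per_layer=96,
                       active_experts=8, hidden_dim=8192)   # toy numbers
    child = derive_subnetwork(parent)
    toy_experts = {i: (random.random(), f"weights_{i}")
                   for i in range(parent.experts_per_layer)}
    print(child)
    print(len(inherit_weights(toy_experts, child)), "experts inherited per layer")
```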
Baidu also restructured its reinforcement learning workflow. Rather than running everything in a single rigid sequence, it split model updates, response generation, and evaluation into independent modules coordinated by a central control unit. The goal is to run each component on the hardware best suited to it, reducing bottlenecks that could slow the overall pipeline.
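The article does not say how the control unit coordinates the modules, so the sketch below is a minimal, queue-based interpretation of the idea rather than Baidu’s framework: rollout generation, reward evaluation, and policy updates run as independent asynchronous workers, and a small controller only wires them together. The module names, batch size, and toy reward are assumptions.

```python
import asyncio
import random

async def generate(prompt_q, response_q):
    """Rollout worker: turns prompts into candidate responses; in a real
    system this stage would sit on inference-optimized hardware."""
    while True:
        prompt = await prompt_q.get()
        if prompt is None:                        # shutdown signal
            await response_q.put(None)
            return
        await asyncio.sleep(0.01)                 # stand-in for decoding
        await response_q.put((prompt, f"response to {prompt}"))

async def evaluate(response_q, reward_q):
    """Reward worker: scores responses independently of generation."""
    while True:
        item = await response_q.get()
        if item is None:
            await reward_q.put(None)
            return
        prompt, response = item
        await reward_q.put((prompt, response, random.random()))   # toy reward

async def update(reward_q, batch_size=4):
    """Trainer: consumes scored rollouts and applies policy updates."""
    batch = []
    while True:
        item = await reward_q.get()
        if item is None:
            return
        batch.append(item)
        if len(batch) >= batch_size:
            print(f"updating policy on {len(batch)} rollouts")
            batch.clear()

async def controller(prompts):
    """Central control unit: connects the stages through queues so each
    one can scale or be placed on different hardware independently."""
    prompt_q, response_q, reward_q = (asyncio.Queue() for _ in range(3))
    for p in prompts + [None]:
        prompt_q.put_nowait(p)
    await asyncio.gather(generate(prompt_q, response_q),
                         evaluate(response_q, reward_q),
                         update(reward_q))

asyncio.run(controller([f"prompt {i}" for i in range(8)]))
```

The point of decoupling is that generation is inference-bound while updates are training-bound; holding them in one synchronous loop leaves whichever side finishes first sitting idle.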
For numerical stability in mixture-of-experts (MoE) models, Baidu implemented a standardized low-precision computing library, which it says roughly halves training instability without reducing throughput.
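Baidu has not identified the library or its techniques, so the snippet below only shows a common ingredient of low-precision training that such a library would likely standardize: per-block scaling before casting to a narrow format like FP8, which keeps a single outlier activation from wrecking the precision of the whole tensor. The block size, the E4M3 range, and the NumPy rounding used as a stand-in for an actual FP8 cast are all assumptions.

```python
import numpy as np

FP8_E4M3_MAX = 448.0   # largest finite value in the FP8 E4M3 format

def quantize_blockwise(x: np.ndarray, block: int = 128):
    """Scale each block so its largest magnitude lands on the FP8 range,
    then round; per-block scales confine outliers to their own block."""
    blocks = x.reshape(-1, block)
    scales = np.abs(blocks).max(axis=1, keepdims=True) / FP8_E4M3_MAX
    scales = np.where(scales == 0, 1.0, scales)    # avoid divide-by-zero
    q = np.round(blocks / scales)                  # stand-in for the FP8 cast
    return q, scales

def dequantize_blockwise(q: np.ndarray, scales: np.ndarray, shape):
    return (q * scales).reshape(shape)

if __name__ == "__main__":
    act = np.random.randn(4, 1024).astype(np.float32) * 10   # toy activations
    q, s = quantize_blockwise(act)
    recon = dequantize_blockwise(q, s, act.shape)
    print("max reconstruction error:", float(np.abs(act - recon).max()))
```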
To manage the “seesaw effect” that can occur when training models on many skills at once—where gains in one area can come at the expense of another—Baidu introduced a four-stage fine-tuning process called MOPD (Multi-Teacher On-Policy Distillation).
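Baidu has not spelled out the four MOPD stages, so the sketch below illustrates only the core idea the name suggests, under illustrative assumptions: the student is scored on tokens from its own rollouts (the on-policy part), and the supervising distribution comes from whichever domain-specialist teacher matches the sample, so progress on one skill does not have to come at another’s expense. The domain labels, teacher set, and plain KL objective are placeholders, not Baidu’s recipe.

```python
import numpy as np

def softmax(logits, axis=-1):
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def kl(p, q, eps=1e-9):
    """Token-level KL divergence from teacher distribution p to student q."""
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def mopd_distillation_loss(student_logits, teachers, domain):
    """On-policy multi-teacher distillation step: logits are computed on the
    student's own rollout tokens, and only the teacher specialized in this
    sample's domain supplies the target distribution."""
    teacher_logits = teachers[domain]
    return float(kl(softmax(teacher_logits), softmax(student_logits)).mean())

if __name__ == "__main__":
    seq_len, vocab = 5, 32                        # toy sizes
    student = np.random.randn(seq_len, vocab)
    teachers = {                                  # illustrative specialist teachers
        "math": np.random.randn(seq_len, vocab),
        "code": np.random.randn(seq_len, vocab),
        "dialogue": np.random.randn(seq_len, vocab),
    }
    print("math distillation loss:", mopd_distillation_loss(student, teachers, "math"))
```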
Baidu says the final reinforcement learning step is critical because distillation from teachers to students can produce overly polished responses that lack diversity. The company describes the overall process as producing ERNIE 5.1 with balanced capabilities across skills.
Baidu says ERNIE 5.1 performs strongly on multiple benchmarks. On LMArena Search Arena, where AI models are scored by real users on web search tasks, ERNIE 5.1 scored 1,223 as of May 9, ranking fourth globally and first among all China-developed models.
In advanced knowledge and reasoning tests, Baidu says ERNIE 5.1 approaches the performance of top Western closed-source models such as Google’s Gemini 3.1 Pro. On math exams such as AIME26, it achieved 99.6% accuracy when using reasoning tools, trailing only Gemini 3.1 Pro.
For agent-related tasks—such as handling complex spreadsheets or multi-step web browsing—Baidu says ERNIE 5.1 surpassed DeepSeek-V4-Pro.
On deployment, Baidu states ERNIE 5.1 is available on more than ten Chinese platforms, including the Isekai Zero role-playing platform, the Storymaster film tool, and the Diting Huanliu graphics app. Developers can also access the model through the Baidu Cloud AI API. Baidu has not released the model weights for independent verification, but the company says ERNIE 5.1’s presence on international leaderboards supports its standing.
Baidu plans to continue showcasing ERNIE and its applications at the Create 2026 developer conference in mid-May in Beijing, with an emphasis on pushing the model into the enterprise market and scaling globally.
Source: Decrypt, The Decoder.
The “golden era” of premium gym chains is ending or already in decline, as rising operating costs collide with shifting consumer preferences toward more flexible, community-based ways to exercise. Long-term memberships are shrinking, margins are pressured by higher rents and facility expenses, and competition from smaller, more personalized…