Baidu Open-Sources ERNIE 4.5 Series Models, Featuring 10 AI Models

Baidu has released its ERNIE 4.5 open-source large language model series, featuring ten models including Mixture-of-Experts (MoE) variants. The models ship with fully open-sourced pre-training weights and inference code, accessible via platforms like Hugging Face. The series is built on an innovative heterogeneous MoE architecture for multimodal capabilities and achieves state-of-the-art performance on various benchmarks, outperforming competitors in text and multimodal tasks. The models are distributed under the Apache 2.0 license, permitting both academic and commercial use.

CNBC AI News – June 30, 2025

Baidu has officially thrown open the doors to its ERNIE 4.5 large language model series, releasing a comprehensive suite of ten models. The lineup includes Mixture-of-Experts (MoE) models with 47 billion and 3 billion activated parameters, alongside a 0.3-billion-parameter dense model, all with fully open-sourced pre-training weights and inference code.

Developers and researchers can now access and deploy the ERNIE 4.5 open-source series through platforms like the PaddlePaddle Xinghe Community and Hugging Face. Additionally, API services for these models are available on the Baidu AI Cloud Qianfan Large Model Platform.
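
As a quick illustration, the Hugging Face checkpoints can be loaded with the standard transformers API. The repository id below is an assumption for the small 0.3B dense variant; check Baidu's Hugging Face organization page for the exact names.

```python
# Minimal sketch: loading an ERNIE 4.5 checkpoint from Hugging Face with transformers.
# The repository id is assumed -- verify the exact name on Baidu's Hugging Face page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baidu/ERNIE-4.5-0.3B-PT"  # assumed id for the 0.3B dense model

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Briefly explain what a Mixture-of-Experts model is."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```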

Baidu Launches ERNIE 4.5 Open-Source Series: Featuring 10 Models

With the unveiling of ten ERNIE 4.5 models, Baidu positions itself at the forefront of the industry. The release stands out for the number of independently developed models, their diversity, the breadth of parameter scales, and the permissive, reliable nature of its open-source licensing.

The ERNIE 4.5 open-source series introduces an innovative heterogeneous MoE architecture, designed for continued pre-training from large language models into multimodal models. This approach not only maintains, but often enhances, text-task performance while significantly boosting multimodal understanding. Key to this performance are multimodal MoE pre-training, an efficient training and inference framework, and post-training techniques tailored to each modality.
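
To make the idea concrete, the toy sketch below routes text and vision tokens to separate expert pools behind a shared interface. It illustrates only the general concept of modality-aware MoE routing and is not Baidu's actual heterogeneous design; the layer sizes, expert counts, and routing scheme are placeholders.

```python
# Toy modality-routed MoE layer: text tokens and vision tokens are dispatched to
# separate expert pools, each with its own top-k router. Concept illustration only.
import torch
import torch.nn as nn


class ModalityRoutedMoE(nn.Module):
    def __init__(self, d_model=64, n_text_experts=4, n_vision_experts=4, top_k=2):
        super().__init__()
        def make_expert():
            return nn.Sequential(
                nn.Linear(d_model, 4 * d_model), nn.GELU(), nn.Linear(4 * d_model, d_model)
            )
        self.text_experts = nn.ModuleList([make_expert() for _ in range(n_text_experts)])
        self.vision_experts = nn.ModuleList([make_expert() for _ in range(n_vision_experts)])
        self.text_router = nn.Linear(d_model, n_text_experts)
        self.vision_router = nn.Linear(d_model, n_vision_experts)
        self.top_k = top_k

    def _route(self, x, router, experts):
        # Pick the top-k experts per token and mix their outputs by softmax weight.
        weights, idx = router(x).topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

    def forward(self, x, is_vision):
        # is_vision: boolean mask marking tokens that came from the vision encoder.
        out = torch.empty_like(x)
        out[~is_vision] = self._route(x[~is_vision], self.text_router, self.text_experts)
        out[is_vision] = self._route(x[is_vision], self.vision_router, self.vision_experts)
        return out


tokens = torch.randn(10, 64)                         # 10 tokens of width 64
is_vision = torch.tensor([False] * 6 + [True] * 4)   # last 4 tokens are image patches
print(ModalityRoutedMoE()(tokens, is_vision).shape)  # torch.Size([10, 64])
```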

All models in the ERNIE 4.5 open-source series were trained and optimized for efficient inference and deployment with the PaddlePaddle deep learning framework. During large language model pre-training, the models reached a Model FLOPs Utilization (MFU) of 47%.
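
For context, MFU measures how much of the hardware's theoretical peak throughput a training run actually converts into useful model computation. The back-of-envelope sketch below uses the common rule of roughly 6 training FLOPs per parameter per token; the chip count, token throughput, and peak FLOP/s figures are placeholder values, not Baidu's configuration.

```python
# Back-of-envelope Model FLOPs Utilization (MFU) estimate.
# All hardware and throughput numbers below are made-up placeholders.

def mfu(active_params, tokens_per_second, num_chips, peak_flops_per_chip):
    # Training costs roughly 6 FLOPs per parameter per token (forward + backward).
    achieved = 6 * active_params * tokens_per_second
    peak = num_chips * peak_flops_per_chip
    return achieved / peak

# Hypothetical example: a 47B-activated-parameter MoE on 1,000 accelerators,
# each with a (placeholder) peak of 1e15 FLOP/s.
print(f"MFU ≈ {mfu(47e9, 1.67e6, 1000, 1e15):.0%}")  # ≈ 47%
```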

Experimental results indicate that the ERNIE 4.5 series achieves state-of-the-art (SOTA) performance across numerous text and multimodal benchmarks, demonstrating particular strengths in instruction following, world knowledge recall, visual comprehension, and multimodal reasoning tasks.

In the realm of text models, the ERNIE 4.5 open-source series boasts strong foundational capabilities, high factual accuracy, robust instruction adherence, and exceptional reasoning and coding abilities. It surpasses prominent models like DeepSeek-V3 and Qwen3 in several mainstream benchmarks.

For multimodal applications, the ERNIE 4.5 open-source series exhibits outstanding visual perception, a rich understanding of visual common sense, and a unified approach to “thinking” and “non-thinking” processes. In leading multimodal large model evaluations for visual common sense, multimodal reasoning, and visual perception, it outperforms even proprietary models like OpenAI’s o1.

Baidu’s lightweight offerings also shine. The ERNIE-4.5-21B-A3B-Base text model performs comparably to Qwen3 models of similar size, while the ERNIE-4.5-VL-28B-A3B multimodal model stands out as the best open-source multimodal model in its parameter class, rivaling even larger models like Qwen2.5-VL-32B.

The ERNIE 4.5 open-source series weights are released under the permissive Apache 2.0 license, supporting both academic research and commercial applications. The accompanying industry-grade development kits, powered by PaddlePaddle, offer broad chip compatibility and significantly lower the barriers for model post-training and deployment.

As one of the earliest companies to invest heavily in AI research in China, Baidu has cultivated a distinct full-stack AI technological advantage, spanning compute power, frameworks, models, and applications. Its PaddlePaddle deep learning platform, China’s first self-developed, feature-rich, and open-source industrial-grade platform, underpins this ecosystem, built upon years of accumulated open-source technology and community engagement.

Coinciding with the ERNIE 4.5 open-source series launch, Baidu also unveiled ERNIEKit, a development toolkit for large language models, and FastDeploy, a toolkit for efficient large model deployment. These tools provide developers with ready-to-use solutions and comprehensive support for the ERNIE 4.5 series and beyond.
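
As a rough usage sketch, a model served locally behind an OpenAI-compatible chat endpoint can be queried with the standard openai Python client. That FastDeploy exposes such an endpoint at this address, and the model name used here, are assumptions; consult the FastDeploy documentation for the actual serving command and routes.

```python
# Sketch: querying a locally served ERNIE 4.5 model via an OpenAI-compatible API.
# The base_url, port, and model name are assumptions, not documented values.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8180/v1", api_key="none")  # local server, no real key

response = client.chat.completions.create(
    model="ERNIE-4.5-21B-A3B",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize the ERNIE 4.5 release in one sentence."}],
)
print(response.choices[0].message.content)
```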

Original article, Author: Tobias. If you wish to reprint this article, please indicate the source: https://aicnbc.com/3627.html
