CNBC AI News – June 30, 2025: Huawei is making a significant move in the AI landscape today, officially announcing the open-sourcing of its Pangu 7-billion-parameter dense model and the Pangu-Pro MoE 72-billion-parameter Mixture-of-Experts model. Crucially, the company is also releasing its Ascend-based model inference technology.
This strategic decision underscores Huawei’s commitment to its Ascend ecosystem strategy. By championing open access to its advanced large language models and underlying inference capabilities, Huawei aims to accelerate research, foster innovation in LLM technology, and drive the practical application and value creation of artificial intelligence across a broad spectrum of industries.
According to information from Huawei’s official website:
The Pangu-Pro MoE 72B model weights and foundational inference code have been officially published on an open-source platform.
The inference code for the ultra-large-scale MoE model, optimized for Ascend, is also now available on an open-source platform.
The Pangu 7B model weights and inference code are slated for release on an open-source platform in the near future.
The Pangu-Pro MoE large model is built on the MoGE (Mixture of Grouped Experts) architecture, with 72 billion total parameters of which 16 billion are active per token. On Huawei's Ascend 300I Duo and 800I A2 platforms, it achieves superior expert load balancing and computational efficiency, delivering inference speeds of 321 tokens/s and 1,528 tokens/s, respectively.
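For readers unfamiliar with grouped expert routing, the sketch below illustrates the general idea behind a grouped Mixture-of-Experts design: experts are partitioned into groups, and each token selects the same number of experts from every group, so the compute assigned to each group (and to the devices hosting it) stays balanced. This is a minimal illustrative sketch only; the function name `grouped_topk_route` and all shapes, group counts, and top-k settings are assumptions for demonstration, not Huawei's actual implementation.

```python
# Hypothetical sketch of grouped top-k expert routing, in the spirit of a
# Mixture of Grouped Experts. Illustrative only; not Huawei's code.
import numpy as np


def grouped_topk_route(router_logits: np.ndarray, num_groups: int, k_per_group: int):
    """Select k experts from each group for every token.

    router_logits: (num_tokens, num_experts); num_experts must be divisible
    by num_groups. Returns global expert indices and normalized mixing
    weights. Because every token activates the same number of experts per
    group, the load across groups is balanced by construction.
    """
    num_tokens, num_experts = router_logits.shape
    group_size = num_experts // num_groups
    grouped = router_logits.reshape(num_tokens, num_groups, group_size)

    # Top-k within each group, independently per token.
    topk_local = np.argsort(grouped, axis=-1)[..., -k_per_group:]

    # Convert group-local indices back to global expert indices.
    offsets = (np.arange(num_groups) * group_size)[None, :, None]
    expert_idx = (topk_local + offsets).reshape(num_tokens, -1)

    # Softmax over the selected experts' logits gives the mixing weights.
    sel = np.take_along_axis(router_logits, expert_idx, axis=-1)
    w = np.exp(sel - sel.max(axis=-1, keepdims=True))
    weights = w / w.sum(axis=-1, keepdims=True)
    return expert_idx, weights


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    logits = rng.normal(size=(4, 64))  # 4 tokens, 64 experts
    idx, w = grouped_topk_route(logits, num_groups=8, k_per_group=1)
    print(idx)              # one expert chosen from each of the 8 groups, per token
    print(w.sum(axis=-1))   # mixing weights sum to 1 for each token
```

Compared with unconstrained top-k routing over all experts, this per-group selection avoids the situation where many tokens pile onto experts that happen to live on the same device, which is one way to pursue the kind of expert load balancing the announcement highlights.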
In terms of raw capability, the Pangu-Pro MoE has garnered significant attention with its strong showing on the latest SuperCLUE benchmark, a respected industry standard for evaluating large language models.
Standing out among peers that often boast parameter counts well into the hundreds of billions (DeepSeek-R1, for instance, has 671B parameters), the Pangu-Pro MoE, with just 72B total parameters, achieved a score of 59, tying for the top position among domestic models with fewer than 100 billion parameters.
Furthermore, with only 16B parameters active per token, it delivers performance that rivals much larger models from other industry players.
Original article, Author: Tobias. If you wish to reprint this article, please indicate the source: https://aicnbc.com/3619.html