
Nvidia CEO Jensen Huang stated on Wednesday that the surge in demand for AI computing is outpacing expectations, driven by the rapid evolution of artificial intelligence models from simple query responses to complex reasoning capabilities.
“This year, particularly in the last six months, the computational needs have increased dramatically,” Huang commented during an interview on CNBC’s “Squawk Box.”
Addressing investor concerns, Huang, whose company's stock rose 2% on Wednesday and helped lift the Nasdaq Composite, pointed to the performance-driven phenomenon fueling Nvidia's growth: AI reasoning models demand exponentially more computing resources, and demand for them is also growing exponentially because of the superior results they deliver.
“The enhanced intelligence of these AIs is leading to widespread adoption,” Huang explained. “We are witnessing the convergence of two exponential growth curves.”
He further highlighted the overwhelming demand for Blackwell, Nvidia’s most advanced graphics processing unit. “Demand for Blackwell is exceptionally high,” he stated. “We are at the cusp of a significant new buildout, the dawn of a new industrial revolution.” This echoes sentiments shared by other industry leaders regarding the potential scale and transformative impact of AI.
Adding to the industry's infrastructure investment, Nvidia announced last month a $100 billion investment in OpenAI's data center expansion. OpenAI plans to build 10 gigawatts of data centers powered by Nvidia chips. The ambitious project underscores the power requirements of cutting-edge AI development and deployment, and has prompted discussions about energy sustainability.
The scale of the AI industry's infrastructure plans has raised concerns about whether leading companies can secure enough power to support their ambitions. Ten gigawatts is roughly equal to the electricity consumption of 8 million U.S. households, or to New York City's baseline peak consumption in the summer of 2024.
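As a rough sanity check on that comparison, a back-of-the-envelope calculation shows how 10 gigawatts maps onto household-scale consumption. The average per-household draw of about 1.2 kW used below is a commonly cited estimate, not a figure from the article:

```python
# Back-of-the-envelope check: how many average U.S. households
# does 10 GW of continuous power correspond to?
# Assumption (not from the article): an average U.S. household
# draws roughly 1.2 kW of power on a continuous basis.

DATA_CENTER_POWER_GW = 10
AVG_HOUSEHOLD_KW = 1.2  # assumed average continuous draw per household

data_center_kw = DATA_CENTER_POWER_GW * 1_000_000  # 1 GW = 1,000,000 kW
equivalent_households = data_center_kw / AVG_HOUSEHOLD_KW

print(f"{equivalent_households / 1e6:.1f} million households")  # ~8.3 million
```

Under that assumption, the result lands near the 8 million households cited above, so the comparison is at least internally consistent.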
When asked about the global AI race, Huang acknowledged that the U.S. and China are currently competitive in the field, noting that Beijing is deploying AI and the infrastructure required to run it at a faster pace than the U.S. It is worth noting that differing regulatory environments and governmental policies may play a role in the speed of this deployment.
“China is significantly ahead in terms of energy infrastructure deployment,” said Huang.
Huang advocated for the artificial intelligence sector to invest in off-grid power generation to meet growing demand without driving up consumer electricity prices. He proposed equipping data centers with natural gas generators, with nuclear power as a possible option in the future. The economic and logistical trade-offs of self-generated power for data centers are an ongoing topic of discussion among energy and technology experts.
“We should consider every possible approach to energy generation,” Huang emphasized. “Data center self-generated power has the ability to move much faster than grid-supplied electricity, and it is imperative that we pursue that.”
Original article, Author: Tobias. If you wish to reprint this article, please indicate the source: https://aicnbc.com/10575.html