Altman Calls AI Water Worries ‘Fake’: ‘Humans Use Energy Too’

OpenAI CEO Sam Altman dismissed concerns about AI’s water consumption as “fake,” comparing AI’s energy use to that of humans. While acknowledging that energy remains a legitimate concern, he argued that AI’s inference stage is now more energy-efficient than a human completing the same task. The debate comes as surging data center energy demands prompt investment in new power sources, but also community opposition.

OpenAI CEO Sam Altman has strongly refuted concerns surrounding the substantial resource demands of artificial intelligence, particularly those related to water consumption in data centers, deeming them “fake.” Speaking at the India AI Impact summit, Altman drew a parallel between the energy footprint of AI systems and that of human beings, suggesting a need for a more nuanced perspective.

When pressed on prevalent criticisms of AI, such as its extensive energy and water usage, Altman dismissed claims circulating online that services like ChatGPT consume vast quantities of water per query. He characterized these assertions as “completely untrue” and “totally insane,” asserting they lack any grounding in reality.

Traditionally, data centers have relied heavily on water for cooling their electrical components to prevent overheating. While advancements in data center cooling technologies have aimed to reduce this reliance, some newer facilities are reportedly moving away from water-based cooling altogether. Despite these efficiency improvements, a recent report from Xylem and Global Water Intelligence projects that water consumption for cooling could more than triple over the next 25 years, driven by escalating computing demands and placing significant pressure on water resources globally.

While downplaying water-related anxieties, Altman acknowledged that energy consumption remains a valid concern for AI development. He stated, “Not per query, but in total – because the world is using so much AI… and we need to move towards nuclear or wind and solar very quickly.”

Addressing previous remarks from Microsoft co-founder Bill Gates, who suggested that the inherent efficiency of the human brain indicates AI’s potential to evolve into a more energy-efficient technology over time, Altman offered a counterpoint. He argued that comparing the energy required to train an AI model with human development is inherently unfair. “It takes like 20 years of life, and all the food you eat before that time, before you get smart,” Altman explained, highlighting the extensive developmental period for human intelligence.

He proposed a more equitable comparison: the energy required for an AI model to answer a question versus a human performing the same task, once the AI model has been trained. “The fair comparison is if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human, and probably AI has already caught up on an energy efficiency basis, measured that way,” he asserted. This comparison focuses on the “inference” stage of AI, which is significantly less power-intensive than the initial training phase.

Altman’s remarks, particularly the direct comparison between AI and human energy efficiency, have ignited considerable online discussion, underscoring growing anxieties about AI’s potential to disrupt the human workforce. Sridhar Vembu, co-founder and chief scientist of Zoho Corporation, who was also present at the summit, voiced his disapproval of equating technological tools with human beings.

This debate unfolds against a backdrop of substantial global investment in new data centers, essential for powering the burgeoning AI landscape. The International Monetary Fund noted in a May report that electricity consumption by global data centers in 2023 had already reached levels comparable to those of entire countries like Germany or France, shortly after the debut of OpenAI’s influential ChatGPT model.

In response to these escalating demands, some governments are streamlining approval processes for new, cost-effective energy sources. However, environmental advocates caution that such measures could conflict with international net-zero emission targets. Furthermore, local communities, particularly in the United States, have increasingly voiced opposition to data center projects, citing concerns about strain on electricity grids and potential increases in energy costs. A recent example includes the City Council of San Marcos, Texas, which voted down a proposed $1.5 billion data center project following months of public outcry.

Amidst this growing resistance, many tech leaders, including Sam Altman, advocate for an accelerated expansion of energy production from diverse sources, emphasizing the critical role of renewable and nuclear energy in meeting the future demands of data centers.

Original article, Author: Tobias. If you wish to reprint this article, please indicate the source: https://aicnbc.com/19157.html
