The next wave of the artificial intelligence revolution will be defined by efficiency, not just raw power. That’s the prediction from Chris Kelly, the former Chief Privacy Officer at Facebook, who believes the industry’s focus will shift from massive infrastructure buildouts to optimizing existing resources.
“Our brains operate on about 20 watts. We don’t need gigawatt power centers to reason,” Kelly stated in a recent interview. “I think efficiency will be a primary objective for the major AI players.”
This emphasis on efficiency could reshape the competitive landscape. Companies that can significantly reduce the cost and power consumption of data centers, the engine rooms of AI, are poised to lead the pack. The current landscape is marked by frenzied expansion: in 2025 alone, the data center market saw over $61 billion in infrastructure deals as hyperscalers worldwide engaged in a construction spree, according to S&P Global.
This expansion is fueled by substantial AI commitments. OpenAI, for instance, has pledged over $1.4 trillion for AI initiatives in the coming years, forging major partnerships with key players like NVIDIA, a leader in graphics processing units (GPUs), and infrastructure providers such as Oracle and CoreWeave.
However, this data center boom is raising critical questions about energy supply. The existing electrical grid is already under strain, and demand from AI infrastructure is only escalating. NVIDIA and OpenAI’s September announcement of a project encompassing at least 10 gigawatts of data center capacity highlights the challenge: that is roughly enough capacity to power 8 million U.S. households, or about the same as New York City’s peak summer demand in 2024, according to the New York Independent System Operator.
The escalating costs of meeting this energy demand are a growing concern. The point was underscored in December 2024, when DeepSeek reportedly trained an open-source large language model for less than $6 million, a fraction of the cost typically associated with comparable models from U.S. competitors. The move suggests that cost-effective AI development is not only possible but is already disrupting the market.
Kelly’s assertion points to a pivot from the current arms race in raw computational power toward a more sustainable and economically viable approach. The companies that master energy efficiency and cost reduction in their AI operations will likely be the true architects of the industry’s future.
Original article, Author: Tobias. If you wish to reprint this article, please indicate the source: https://aicnbc.com/14911.html