
Lisa Su, chair and chief executive officer of Advanced Micro Devices Inc. (AMD), during a Bloomberg Television interview in San Francisco, California, US, on Monday, Oct. 6, 2025.
David Paul Morris | Bloomberg | Getty Images
AMD CEO Lisa Su laid out an ambitious growth plan for the company, projecting overall revenue growth of roughly 35% per year over the next three to five years, fueled primarily by surging demand for artificial intelligence chips. Su said the growth would be led by AMD’s AI data center business, which she expects to expand at roughly 80% annually over the same period, reaching tens of billions of dollars in sales by 2027.
“This is what we see as our potential given the customer traction, both with the announced customers, as well as customers that are currently working very closely with us,” Su told analysts, highlighting the increasing customer confidence in AMD’s AI solutions.
According to Su, AMD aims to secure a “double-digit” share of the fast-growing data center AI chip market within the next three to five years. That ambition is a direct challenge to Nvidia, which by some estimates currently holds more than 90% of the market.
AMD shares initially dipped 3% in extended trading but recovered after the company said it expects gross margins of 55% to 58% in the coming years, a figure that topped analyst expectations and points to improving profitability.
AMD’s financial analyst day, its first since 2022, showcased the company’s position at the center of the AI-driven data center spending boom. As companies invest heavily in GPUs to power AI applications such as OpenAI’s ChatGPT, they are seeking alternative suppliers to control costs and diversify their supply chains. As Nvidia’s primary competitor in the GPU market, AMD stands to benefit significantly from this trend.
In October, AMD announced a partnership with OpenAI, agreeing to supply the AI startup with billions of dollars’ worth of its Instinct AI chips over several years. Shipments could begin in 2026, with an initial phase delivering enough chips to draw 1 gigawatt of power, a commitment that underscores the scale of the collaboration.
Under the agreement, OpenAI could acquire a 10% stake in AMD. Beyond the OpenAI deal, Su highlighted long-term strategic collaborations with Oracle and Meta, further diversifying AMD’s customer base and reinforcing its position in the AI ecosystem.
Fueled by these developments, AMD shares have nearly doubled in value year-to-date in 2025, reflecting investor confidence in the company’s growth prospects.
OpenAI plays a crucial role in the development of AMD’s next-generation systems, built around its Instinct MI400X AI chips, slated for release in the coming year. This collaboration extends beyond mere chip supply; it involves optimizing AMD’s hardware for cutting-edge AI applications.
AMD emphasized the rack-scale capabilities of its chips, which are designed to be assembled into systems in which 72 chips operate as a single unit. That unified design is critical for running large AI models and for competing with Nvidia’s rack-scale systems, which have now been available for three product generations.
Su estimates that the total addressable market for AI data center parts and systems will reach $1 trillion per year by 2030, implying annual growth of roughly 40%. That forecast is a significant step up from the company’s previous estimate of $500 billion by 2028, reflecting an increasingly bullish outlook on the AI market. AMD reported $5 billion in AI chip sales in its fiscal 2024.
The updated figure includes central processors (CPUs), particularly AMD’s Epyc series, which are among the company’s top-selling products. While CPUs are integral to computer systems, they are not dedicated AI accelerators like GPUs; even with CPUs included in the forecast, the core driver remains accelerating demand for specialized AI chips.
AMD continues to innovate and compete in the CPU market against Intel and other ARM-based processors. Although AMD’s emphasis on AI has taken center stage, the company maintains a diverse portfolio of chips for gaming consoles, networking, and various other applications. This diversification helps to mitigate risk and capitalize on opportunities across the broader semiconductor landscape.
Despite the heightened focus on its AI business, AMD reassured shareholders that its legacy businesses are also growing. “The other message that we want to leave you with today is every other part of our business is firing on all cylinders, and that’s actually a very nice place to be,” Su said.
Original article, Author: Tobias. If you wish to reprint this article, please indicate the source: https://aicnbc.com/12680.html