
A person walks by a sign for a Micron Technology office in San Jose, California, on June 25, 2025.
Justin Sullivan | Getty Images
Micron announced Wednesday that it will exit the consumer memory market to concentrate on the high‑performance memory that feeds AI chips in next‑generation data centers.
“The AI‑driven growth in the data‑center sector has created a surge in demand for memory and storage,” said Sumit Sadana, Micron’s business chief. “We have made the difficult decision to exit the Crucial consumer business in order to improve supply and support for our larger, strategic customers in faster‑growing segments.”
The move underscores how the AI infrastructure boom is tightening supply of key components such as DRAM and high‑bandwidth memory (HBM). Industry analysts warn that the rapid rollout of AI workloads could lead to a prolonged global memory shortage.
Despite a 175% gain in Micron’s share price this year, the stock slipped 3% on Wednesday to $232.25 following the announcement.
Modern AI accelerators, from Nvidia’s GPUs to AMD’s Instinct processors and Google’s TPUs, rely on massive quantities of the most advanced memory. Nvidia’s latest GB200 platform, for example, pairs each Blackwell GPU with 192 GB of HBM, while Google’s Ironwood TPU likewise carries 192 GB of high‑bandwidth memory per chip. AMD’s Instinct MI350 AI processor is built around 288 GB of HBM, a capacity edge in large‑scale model training.
Historically, Micron’s “Crucial” brand supplied DRAM modules and solid‑state drives for hobbyists and consumer laptops, where typical memory configurations run 8 GB to 16 GB per system. By shedding that business, Micron can reallocate manufacturing capacity to the higher‑margin, higher‑growth AI segment.
In the high‑bandwidth memory market, Micron competes with South Korean giants SK Hynix and Samsung, but it remains the only U.S.‑based supplier. SK Hynix is currently Nvidia’s primary memory partner, while Samsung has deep ties with both Nvidia and Google.
Micron also supplies AMD, whose AI chips are designed with more onboard memory than competing parts, a measurable performance advantage. AMD’s MI350 platform, for instance, draws on Micron HBM for its 288 GB memory complement to accelerate deep‑learning workloads.
Although Micron does not break out the Crucial line in its earnings releases, its cloud‑memory segment posted a 213% year‑over‑year increase in the most recent quarter, indicating strong demand from hyperscale providers.
Goldman Sachs analysts raised their price target for Micron to $205 from $180, while maintaining a hold rating. The note highlighted “continued pricing momentum” in memory and projected upside to consensus estimates when Micron reports its next quarterly results.
A company spokesperson declined to comment on potential workforce reductions but said Micron would seek to redeploy affected employees into open positions across the organization.
Industry experts suggest that Micron’s strategic pivot could accelerate consolidation in the memory market, as vendors scramble to secure HBM capacity for AI. Companies that fail to secure sufficient supply may face higher component costs, potentially slowing AI deployment timelines. Conversely, firms that successfully lock in long‑term memory contracts stand to benefit from cost predictability and faster time‑to‑market for AI services.