$500 Cables Help Credo Capitalize on AI Boom

Credo, a semiconductor company specializing in high-speed connectivity, has seen its stock surge due to increasing demand for its active electrical cables (AECs) in AI infrastructure. Its AECs are crucial for connecting AI servers in data centers, with major clients like Amazon and Microsoft utilizing Credo’s signature purple cables. The company’s revenue doubled in fiscal year 2025, driven by the AI boom and hyperscalers’ data center expansions. While facing competition, Credo is expanding its product portfolio and collaborating with hyperscalers to meet the insatiable demand for AI connectivity solutions.

A demo setup of racks of AI servers connected with Credo cables, displayed at the Open Compute Summit in San Jose, California.

Credo

Back in July, Elon Musk offered a peek inside xAI’s Colossus 2 data center, a Memphis, Tennessee, facility the AI startup hopes will become a supercomputing powerhouse.

Musk’s posts on X showcased the meticulously organized purple cables connecting the servers, drawing attention to a critical and often overlooked aspect of the AI infrastructure boom: interconnectivity.

These signature purple cables are the product of Credo, a 17-year-old Silicon Valley-based semiconductor company. While Credo may not be a household name like Nvidia, the company’s specialized offerings are becoming increasingly crucial in supporting the demands of AI compute.

Wall Street is taking notice.

Credo’s stock has more than doubled this year to $143.61, adding to a hefty 245% surge in 2024. Its market capitalization, roughly $1.4 billion at its 2022 IPO, now stands near $25 billion. Credo is strategically positioning itself as a critical supplier in the trillion-dollar AI infrastructure race, capitalizing on the escalating investments across the sector. The active electrical cable (AEC) market, a domain where Credo has been a pioneering force, is projected to reach $4 billion by 2028 as hyperscalers pump vast sums into data center expansions.

“The industry outlook is supported by increasing deployments from major companies such as Amazon, Microsoft, and xAI as well as broadening adoption, including Meta and more,” analysts wrote, projecting substantial annualized revenue growth for Credo through 2028.

Fiscal year 2025, which ended in early May, saw Credo’s revenue more than double to $436.8 million. More impressively, the company swung to profitability, posting net income of $52.2 million compared to the previous year’s $28.4 million loss. Wall Street expects this growth trajectory to continue, with sales forecasted to nearly double again in fiscal 2026, approaching $1 billion, according to LSEG data.

According to estimates from industry researcher 650 Group, Credo’s signature purple AECs command prices between $300 and $500 each, influenced by volume discounts and negotiated terms. These robust copper cables, shielded by a braided covering and featuring chip-equipped connectors at each end, are designed to handle the immense data throughput required by AI workloads.

The AI boom is undeniably the primary catalyst for Credo’s ascendance. However, the current surge is largely fueled by a select group of hyperscale companies aggressively scaling data centers to accommodate future AI workloads. Experts forecast a staggering $1 trillion in investments in AI data centers by 2030. A deceleration in capital expenditure from major cloud providers, or a retrenchment in OpenAI’s ambitions, could introduce headwinds for Credo and other suppliers reliant on this expansion.

For the moment, projections remain bullish.

Expanding Opportunity

Traditional servers were configured with one or two processors on a single motherboard. Today, individual servers can accommodate up to eight processors, and cutting-edge AI models can demand the collective processing power of millions of GPUs interacting as a unified computational entity.

Each GPU requires a dedicated connection to the switch, a component responsible for routing data across the cluster.

Nvidia’s latest architectures integrate multiple boards to construct systems accommodating a substantial array of GPUs, and the next generation of high-performance racks will double this capacity. Nvidia’s roadmap includes the advanced Kyber rack design, which will pack an even larger GPU count, enabling vastly increased processing capabilities.

“In the past, Credo’s opportunity was one cable per server, but now Credo’s opportunity is multiple cables per server,” noted Alan Weckel, an analyst at 650 Group. He estimates that Credo holds a dominant share of the AEC market, which is also served by Astera Labs and Marvell.

While fiber optic cables, often powered by components from companies like Broadcom, provide the backbone for many of these GPU connections, AECs present an alternative approach. With chips at both ends, AECs use signal-processing algorithms to recover and retransmit data as it travels through the cable, allowing longer transmission distances than traditional passive copper cables. Credo’s longest AEC can stretch up to seven meters.

Credo CEO Bill Brennan highlighted that hyperscalers prefer Credo’s cables for their superior reliability compared to fiber optic cables. According to Brennan, customers are actively seeking to mitigate “link flaps,” momentary connection drops in an AI cluster, often caused by optical cable failures, that can result in costly downtime.

“It can literally shut down an entire data center,” Brennan stated.

Credo is seeing increasing engagement with key hyperscalers during the early planning stages of large-scale AI clusters. This collaboration is particularly beneficial as designs become more compact, allowing for greater server density and the use of shorter cables.

“When you connect with these hyperscalers, the numbers are very large,” Brennan said.

Credo’s AEC leadership team.

Corey Bentley, Credo

While Credo remains tight-lipped about its specific hyperscaler clients, analysts have identified Amazon and Microsoft as key customers. Proof of the partnership came when Amazon Web Services CEO Matt Garman posted an image on LinkedIn showcasing the company’s Trainium AI chip racks, with Credo’s signature purple cables clearly visible.

Credo anticipates that three or four customers will each account for more than 10% of overall revenue in the upcoming quarters, with new hyperscale clients onboarding this year.

Amazon and Microsoft declined to comment. Meta and xAI did not respond to requests for comment.

At a recent data center conference, Credo shared the stage with a representative from Oracle Cloud. An example rack of Nvidia GPUs crafted by Meta and showcased at the event notably featured Credo’s distinct purple cables.

“Every time you see a new announcement of a gigawatt data center, you can rest assured that we view that as an opportunity,” Brennan told investors on an earnings call in September.

The race for dominance in AI networking is crowded. While TD Cowen analysts estimate the market for AI networking chips could hit $75 billion annually by 2030, major players like Nvidia and Advanced Micro Devices (AMD), each commanding substantial networking divisions, hold significant control over which technologies will be supported in the broader AI ecosystem.

‘Insatiable Demand’

Credo’s origins trace back to 2008, when a team of former Marvell engineers established the company to focus on SerDes (Serializer/Deserializer) technology, a niche yet essential component for high-speed chip-to-chip communication.

When Brennan joined in 2013, his mission was to transform this technology into commercially viable products. The company secured its initial round of venture funding, with investors including Walden International, led by Lip-Bu Tan, who is now Intel’s CEO.

Brennan credits the AI boom of the early 2020s as the decisive catalyst for the AEC business, as data centers grew rapidly and needed advanced technology.

Before AI demand became widespread, Tesla reached out in 2017. The electric car manufacturer needed help with its Dojo AI supercomputer and sought chips that could deliver more bandwidth than what was commercially available at the time.

Leveraging its position in active copper cables, Credo is now expanding its product portfolio to include intra-rack interconnects, commonly known as “scale-up” networking. The company announced new transceivers and software for optical cables this week.

“You’ve got this market pull like we’ve never had before,” Brennan said. “If you could deliver the next generation right now, it would be consumed. Generation after that, it would be consumed. You’ve got this insatiable demand from the AI cluster world.”
