What Could Broadcom's Latest Earnings Report Mean for Nvidia Investors?


Demand for the chips required to train and deploy artificial intelligence (AI) models and applications in data centers has increased at a remarkable pace over the past couple of years. Nvidia (NVDA) has been the biggest beneficiary of this fast-growing market, leaving its fellow chipmakers in the dust and capturing the lion's share of the AI semiconductor space.

No other chipmaker comes close to Nvidia in terms of revenue generated from selling AI chips. In the third quarter of fiscal 2025 alone (which ended Oct. 27), Nvidia sold $30.8 billion worth of data center chips, an increase of 112% from the same period a year earlier. Its nearest rival in data center graphics processing units (GPUs), Advanced Micro Devices, is forecasting $5 billion in data center GPU revenue this year.

However, GPUs aren't the only type of chip that cloud service providers are using for AI model training and inference. This is evident from the rapid growth Broadcom (AVGO) reported in demand for its application-specific integrated circuits (ASICs) in its latest fiscal year, driven by AI workloads.

Let’s take a closer look at the growth of Broadcom’s AI chip business and see if this could be a reason for Nvidia investors to worry.

Broadcom’s AI growth points toward terrific demand for custom AI processors

Broadcom finished fiscal 2024 (which ended on Nov. 3) with $12.2 billion in revenue from sales of its custom AI accelerators — which it brands as XPUs — and networking chips needed in AI data centers. Of course, that’s nowhere near the annual revenue that Nvidia is on track to generate from its data center business this year.

After all, Nvidia's data center revenue in the first nine months of the current fiscal year stands at a massive $79.6 billion. However, investors should note that the terrific pace of growth in this segment has been decelerating of late. That's not surprising, considering that Nvidia is already working from a massive revenue base and is currently in the middle of a transition to a new generation of AI data center chips based on its Blackwell architecture.

At the same time, the big jump in Broadcom's AI revenue shows that demand for alternative chips to tackle AI workloads is on the rise. Again, that's not surprising. Major cloud companies have been developing in-house chips to cut costs and speed up AI development, given how expensive Nvidia's AI chips are and the long wait times customers face to get them.

This trend is playing in Broadcom's favor, as its ASICs, chips designed to perform specific tasks with better performance and efficiency than GPUs for those workloads, are reportedly being used by big names such as Alphabet's Google, Meta Platforms, ByteDance, and even OpenAI. The shift is likely to continue: According to McKinsey, ASICs are expected to "serve the majority of workloads" in AI accelerators.

Not surprisingly, the market for AI-specific ASICs is expected to grow at an annual rate of 32% through 2030, according to Lucintel. Broadcom management struck a confident note on the latest earnings conference call, with CEO Hock Tan stating:

As you know, we currently have three hyper-scale customers who have developed their own multi-generational AI XPU road map to be deployed at varying rates over the next three years. In 2027, we believe each of them plans to deploy 1 million XPU clusters across a single fabric. We expect this to represent an AI revenue serviceable addressable market, or SAM, for XPUs and network in the range of $60 billion to $90 billion in fiscal 2027 alone.

Broadcom, therefore, believes the market for custom AI chips is set to grow severalfold over the next three years.

Should Nvidia investors be worried?

Concerns about a gradual slowdown in Nvidia's AI-fueled growth and rising competition in the AI chip space are probably why shares of the company headed lower over the past month, despite the company posting better-than-expected results on Nov. 20. However, investors shouldn't worry just yet.

That's because demand for AI-focused GPUs is expected to remain far higher than demand for custom chips. As noted above, Broadcom is forecasting the custom AI processor market to reach $60 billion to $90 billion in annual revenue within the next three years. However, the overall AI accelerator market, including central processing units (CPUs), GPUs, and ASICs, is expected to hit $500 billion in 2028, according to AMD.

Other reports suggest something similar, with one estimate putting the size of the AI chip market at $621 billion in 2032, of which 34% is expected to come from GPUs. That works out to an AI GPU market of more than $200 billion in the long run, suggesting Nvidia still has plenty of room for growth. Moreover, Nvidia's data center opportunity isn't limited to AI. The broader transition to accelerated computing means the company is sitting on a huge addressable opportunity that could drive impressive growth for years to come.

So, even though Broadcom's AI business has gained impressive momentum, Nvidia investors shouldn't be worried. Nvidia can still deliver healthy gains for its shareholders over the long run.

Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board of directors. Harsh Chauhan has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Alphabet, Meta Platforms, and Nvidia. The Motley Fool recommends Broadcom. The Motley Fool has a disclosure policy.


