AMD’s AI Server Push Could Get Huge, With UBS Expecting 80% CPU Revenue Jump

By Aimirul

AMD’s data centre story is getting spicy, and no, this is not just another “AI hype” headline.

According to UBS estimates cited by Wccftech, AMD’s server CPU revenue could climb by around 80% this year, helped by strong demand for AI infrastructure and a market where Intel may not be able to fully supply what customers want. For a company already pushing hard with EPYC server processors, that is a pretty big signal.

The key point here: AI is not only about GPUs. Agentic AI systems, inference workloads, cloud platforms, and enterprise AI tools still need serious CPU muscle behind the scenes. That is where AMD’s EPYC chips come in, especially for companies building large-scale server clusters.

UBS reportedly sees Intel’s recent guidance as indirectly positive for AMD, with commentary suggesting Intel could be undershipping the broader server CPU market by roughly 20%. If customers need more server chips and Intel cannot fully cover that demand, AMD has a clean opening to grab more share.

AMD’s AI GPU numbers are also climbing

The CPU side is only half the story. UBS has also raised its forecast for AMD’s Instinct AI GPU shipments across the next few years.

Based on the figures shared, AMD is now expected to ship around 544,500 AI server GPUs in 2025, rising to about 816,800 units in 2026. By 2027, the estimate jumps close to 1.9 million AI GPUs, which would be a major step up if AMD can execute properly.

The current expectation is that AMD’s MI350 series will drive much of the 2026 momentum, while the MI450 family starts ramping and becomes a bigger part of the 2027 story. Further ahead, AMD’s MI500 GPUs are reportedly planned for the 2027-2028 window, with co-packaged optics expected to be part of the platform.

That matters because co-packaged optics is one of those boring-sounding data centre technologies that can become very important when AI clusters get massive. Faster, more efficient chip-to-chip and rack-scale communication is crucial when companies are trying to train and run increasingly huge models.

Wccftech also notes analyst chatter that AMD could potentially land a major Anthropic-related deal with its future MI400 series, though that remains in the “watch this space” zone for now.

Why Malaysian and SEA readers should care

At first glance, server CPUs and AI accelerators sound like something only hyperscalers and cloud engineers care about. But this stuff does trickle down.

If AMD becomes a stronger second force against NVIDIA in AI hardware, it could mean better supply, more competition, and eventually more affordable cloud compute. That matters for Malaysian startups, universities, local AI builders, game studios, and even esports or creator-tech platforms running AI tools in the background.

For gamers and PC builders, this is also worth watching because AMD’s data centre success affects the whole company’s roadmap. More revenue from EPYC and Instinct gives AMD more room to fund next-gen CPU and GPU development. That does not automatically mean cheaper Radeon cards tomorrow, but stronger competition at the top end usually helps the broader tech ecosystem over time.

There is also a China angle. Wccftech reports that AMD is estimated to hold around 12% share in China’s AI GPU market this year, putting it ahead of NVIDIA on official figures, since export restrictions have severely limited NVIDIA’s presence there. That said, the report also notes that some research firms may still be accessing NVIDIA AI chips through unofficial channels, so the real-world picture may be messier than official market share numbers suggest.

AMD is set to report its first-quarter 2026 earnings soon, and expectations are clearly running hot. If the numbers match the analyst optimism, AMD’s AI server push may no longer look like a side quest — it could become one of the company’s main storylines for the next few years.

Source: Wccftech Gaming

Tags

AMD, AI GPUs, EPYC, Data Center, Malaysia Tech