AI across AMD’s entire portfolio? Believe it!

A little over a year ago, AMD CTO Mark Papermaster said the company’s strategy was to offer AI everywhere. Now learn how AMD, with help from Supermicro, is bringing this strategy to life.

March 12, 2025 | Author: Peter Krass

A year in the fast-moving world of artificial intelligence can seem like a lifetime.

Consider:

  • A year ago, ChatGPT had fewer than 200 million weekly active users. Now this generative AI tool has 400 million weekly active users, according to developer OpenAI.
  • A year ago, no one outside of China had heard of DeepSeek. Now its GenAI chatbot is disrupting the AI industry, challenging the way some mainstream tools function.
  • About a year ago, AMD CTO Mark Papermaster said his company’s new strategy called for AI across the entire product portfolio. Now AMD, with help from Supermicro, offers AI power for the data center, cloud and desktop. AMD also offers a robust open AI stack.

‘We’re Thrilled’

AMD’s Papermaster made his comments in Feb. 2024 during a fireside chat hosted by stock research firm Arete Research.

During the interview, Papermaster acknowledged that the early customers for AMD’s AI hardware were mostly big cloud hyperscalers, including AWS, Google Cloud and Microsoft Azure. But he also said new customers were coming, including both enterprises and individual endpoint users.

“We’re thrilled to bring AI across our entire portfolio,” Papermaster said.                                                                          

So how has AMD done? According to the company’s financial results for both the fourth quarter and the full year 2024, pretty well.

Aggressive Investments

During AMD’s recent report on its Q4:24 and full-year ’24 financial results, CFO Jean Hu mentioned that the company is “investing aggressively in AI.” She wasn’t kidding, as the following items show:

  • AMD is accelerating its AI software road map. The company released ROCm 6.3, which includes enhancements for faster AI inferencing on AMD Instinct GPUs (see the short sketch after this list). The company also shared an update on its plans for the ROCm software stack.
  • AMD announced a new GPU system in 2024, the AMD Instinct MI325X. Designed for GenAI performance, it’s built on the AMD CDNA3 architecture and offers up to 256GB of HBM3E memory and up to 6TB/sec. of bandwidth.
  • To provide a scalable AI infrastructure, AMD has expanded its partnerships. These partnerships involve companies that include Aleph, IBM, Fujitsu and Vultr. IBM, for one, plans to deploy AMD MI300X GPUs to power GenAI and HPC applications on its cloud offering.
  • AMD is offering AI power for PCs. The company added AI capabilities to its Ryzen line of processors. Dell, among other PC vendors, has agreed to use these AMD CPUs in its Dell Pro notebook and desktop systems.
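
For readers who want to kick the tires on that ROCm point, here is a minimal sketch, assuming a ROCm-enabled build of PyTorch on a machine with an AMD Instinct GPU. It only confirms the GPU is visible through the familiar torch.cuda API and runs a small, inference-style matrix multiply in half precision; the sizes and checks are illustrative, not an AMD-supplied example.

```python
# Minimal sketch: confirm a ROCm-enabled PyTorch build sees an AMD Instinct GPU
# and run a tiny inference-style workload on it. Assumes ROCm and a ROCm build
# of PyTorch are already installed (illustrative only, not AMD's own example).
import torch

# On ROCm builds, torch.version.hip is set and the GPU is exposed through the
# familiar torch.cuda API (ROCm devices use the "cuda" device type).
if torch.version.hip is None or not torch.cuda.is_available():
    raise SystemExit("No ROCm-enabled GPU detected")

print(f"ROCm/HIP version: {torch.version.hip}")
print(f"GPU: {torch.cuda.get_device_name(0)}")

# Stand-in for an inference pass: a half-precision matrix multiply on the GPU.
device = torch.device("cuda")
with torch.inference_mode():
    x = torch.randn(1, 4096, dtype=torch.float16, device=device)
    w = torch.randn(4096, 4096, dtype=torch.float16, device=device)
    y = x @ w
print(f"Output shape: {tuple(y.shape)}")
```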

Supermicro Servers

AMD partner Supermicro is on the AI case, too. The company now offers several AMD-powered servers designed specifically for HPC and AI workloads.

These include an 8U 8-GPU system with AMD Instinct MI300X GPUs. It’s designed to handle some of the largest AI and GenAI models.

There’s also a Supermicro liquid-cooled 2U 4-way server. This system is powered by the AMD Instinct MI300A, which combines CPU and GPU cores, and it’s designed to support workloads that converge HPC and AI.

Put it all together, and you can see how AMD is implementing AI across its entire portfolio.
