“Agentic AI represents a new class of users: autonomous systems that are always on, constantly analysing, learning and making decisions,” Su said during her keynote at Advancing AI 2025.
“This is driving a step-function increase in the need for compute across the board.”
These agents, AI systems that can act independently, are rapidly increasing demand for inference.
She said: “It’s like adding billions of new power users to the internet, each one consuming compute 24/7 and to support them, we need a massive leap forward in GPU and CPU performance.”
To address that, the technology giant introduced the new MI350 series, which supports models with up to 520 billion parameters on a single GPU and delivers 4x the performance of its predecessor.
“We now believe inference will be the largest driver of AI compute,” Su said. “Our MI300 family is already being deployed at scale by top model builders, including OpenAI, Meta and Microsoft.”
AMD is also positioning open ecosystems as a key advantage.
“Open ecosystems matter,” Su stated. “They allow faster innovation, broader adoption, and more flexibility: exactly what enterprises and sovereign AI programs are asking for.”
She added that the company is now working with more than 40 sovereign AI initiatives around the world.
“Nations are choosing AMD because we offer performance, flexibility and openness without lock-in,” she stated.
“We’re entering the next chapter of AI,” Su concluded. “With the MI350 series, we’re delivering leadership performance across the most demanding models, and we’re just getting started: we’re already deep in development of MI400 for 2026.”