
At its Advancing AI 2025 conference in San Jose, AMD CEO Lisa Su projected that the AI chip market will surpass $500 billion by 2028. The company expects over 60% annual growth, up from a $45 billion market in 2023. Su emphasized the shift toward AI inferencing, which is expected to drive the next wave of demand. AMD also unveiled its MI350 Series GPUs and showcased several new AI-focused technologies. Seven of the world’s top ten AI customers, including Reliance Jio, are already deploying AMD Instinct accelerators, a clear signal of AMD’s growing momentum in the fiercely competitive AI processor space.
MI350 Series GPUs and Developer Cloud Access Highlight AMD’s Strategy
AMD’s newly launched MI350 Series GPUs are designed to boost AI inferencing performance, offering up to 40% more tokens per dollar compared to earlier models. These chips, set for commercial availability in Q3 2025, boast a fourfold improvement in compute power over the previous generation. The MI350 is already in production and being used by major clients, including Meta, Microsoft, Oracle, and OpenAI. Su highlighted that AMD’s approach is rooted in open architecture, aiming to foster rapid innovation by giving developers early access. To support this, AMD announced a Developer Cloud Access Program that enables hands-on experimentation with AMD’s AI offerings.
Su emphasized that openness speeds up breakthroughs, pointing to historical examples of developer-driven innovation. This stands in contrast to Nvidia’s more closed ecosystem. With rising demand for AI hardware, AMD is positioning itself as a flexible alternative to Nvidia, especially as chip shortages continue. AMD also introduced Helios, a fully integrated, rack-scale AI platform built to simplify deployment at scale; the system will be commercially available starting in 2026. With a strategic focus on inferencing, infrastructure, and developer engagement, AMD is targeting the most lucrative segments of the AI market while differentiating through ecosystem openness and performance per dollar.
AMD Challenges Nvidia’s Dominance With Openness and Efficiency
While Nvidia dominates the AI GPU market, AMD is gaining traction by targeting gaps in availability, pricing, and flexibility. Su acknowledged Nvidia’s market strength but positioned AMD as the platform of choice for customers seeking faster returns and open development frameworks. With chip scarcity driving longer wait times for Nvidia products, AMD is appealing to hyperscalers and telcos alike. AMD’s MI350 Series and Helios platform are built for real-world deployments and optimized for AI inferencing, seen by many as the next revenue frontier after model training.
Seven of the ten largest AI customers now use AMD Instinct accelerators, according to Su. Reliance Jio, India’s largest telecom provider, is among those integrating AMD technology into its infrastructure. Su’s conversations with OpenAI’s Sam Altman and tech leaders from Meta, Oracle, and Microsoft during the conference underscored AMD’s growing influence in enterprise AI. In addition to performance, AMD’s focus on developer support may further increase adoption. The newly launched cloud platform offers early access to tools and infrastructure, a move AMD believes will drive faster iteration and innovation. Su reiterated the company’s belief in democratized AI development and pledged ongoing investment in accessible, scalable compute power that doesn’t sacrifice performance for affordability.
MI350 and Helios Set AMD’s Roadmap for AI Infrastructure Leadership
AMD’s roadmap hinges on powerful, scalable AI chips like the MI350 and integrated solutions like Helios. The MI350 Series is tailored for inferencing efficiency, while Helios provides a rack-scale, plug-and-play platform for enterprise deployment. Both are aimed at closing performance gaps and removing AI infrastructure friction. AMD’s commitment to open architecture, combined with cloud access for developers, positions it as a nimble alternative in an Nvidia-dominated market. With key customers already onboard and broad ecosystem support, AMD is set to challenge incumbents by offering speed, openness, and cost-efficiency, critical factors as global demand for AI compute accelerates over the next three years.