AMD CEO Lisa Su announced the launch of new AI hardware, including the MI355 inference accelerator, emphasizing their competitive edge and commitment to affordable, efficient AI solutions amid fierce industry competition. She highlighted AMD’s strategic focus on open ecosystems, ongoing product development, and strong industry partnerships to capture growth in the rapidly expanding AI market.
At AMD’s Advancing AI event in San Jose, CEO Lisa Su announced the launch of their new AI hardware and systems, emphasizing their competitive stance against industry giants like Nvidia. She highlighted the introduction of the AMD MI355 chip, which she described as the best inference accelerator on the market, delivering 40% more tokens per dollar. This improvement, she said, aims to make AI more affordable and accessible globally, reflecting AMD’s focus on efficiency and cost-effectiveness in AI workloads.
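To make the “tokens per dollar” framing concrete, here is a minimal sketch of how such a metric is typically computed. The throughput and hourly-cost figures are illustrative placeholders, not numbers from AMD’s presentation; only the 40% uplift comes from the keynote claim.

```python
# Hypothetical sketch of a tokens-per-dollar calculation.
# All input figures are illustrative placeholders, not AMD or Nvidia data.

def tokens_per_dollar(tokens_per_second: float, hourly_cost_usd: float) -> float:
    """Tokens generated per dollar of compute, given sustained inference
    throughput and an all-in hourly cost (hardware amortization, power, hosting)."""
    tokens_per_hour = tokens_per_second * 3600
    return tokens_per_hour / hourly_cost_usd

# Assumed baseline accelerator vs. one that is 40% better on this metric,
# as claimed for the MI355 in the keynote.
baseline = tokens_per_dollar(tokens_per_second=10_000, hourly_cost_usd=8.00)
improved = baseline * 1.40  # "40% more tokens per dollar"

print(f"Baseline accelerator: {baseline:,.0f} tokens per dollar")
print(f"With a 40% uplift:    {improved:,.0f} tokens per dollar")
```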
Su also discussed AMD’s broader product roadmap, including their 2026 lineup of AI accelerators and rack-scale systems. The company has committed to an annual cadence of new hardware, shipping not just chips but complete systems designed for data centers and production workloads. This strategy supports their goal of providing end-to-end solutions for the growing demand for AI infrastructure across industries.
The CEO emphasized AMD’s strong position in the AI ecosystem, noting that seven of the top ten AI model builders, including Meta, Oracle, OpenAI, and Tesla, use AMD products. She explained that these customers need the latest hardware to deploy AI models efficiently at scale. To support this, AMD is investing heavily in software to simplify deployment and is leveraging their ZT Systems acquisition to offer rack-scale solutions, enabling faster time-to-market for AI applications.
Su acknowledged the intense competition from Nvidia and from major cloud providers such as AWS, Azure, and Google, which develop their own AI chips to optimize costs and margins. She articulated AMD’s strategy of maintaining openness and flexibility, emphasizing the importance of co-developing hardware and software with partners. This approach aims to deliver highly programmable, adaptable architectures that can serve a diverse range of AI workloads, positioning AMD as a versatile player in the rapidly expanding AI market.
Finally, Su reiterated her belief that AI is still in its early stages, with the market expected to grow at more than 60% annually to exceed $500 billion by 2028. She highlighted the performance gains AMD has achieved, with their latest hardware delivering three times the performance of last year’s products and future generations promising tenfold improvements. AMD’s focus on open ecosystems, innovation, and close collaboration with partners underscores their strategy to capture a substantial share of the burgeoning AI industry.
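As a rough sanity check on that growth framing, the sketch below compounds a market at 60% per year over five years. The base-year market size is an assumed placeholder, not a figure cited in the keynote; only the growth rate and the 2028 target come from the stated projection.

```python
# Rough sanity check on the ">60% annual growth to over $500B by 2028" framing.
# The base-year market size below is an assumed placeholder, not a cited number.

base_year, target_year = 2023, 2028
assumed_base_market_usd_b = 45   # hypothetical starting market size, in $B
annual_growth = 0.60             # ">60% annually" from the keynote

years = target_year - base_year
projected = assumed_base_market_usd_b * (1 + annual_growth) ** years
print(f"{years} years at {annual_growth:.0%} growth: ~${projected:.0f}B by {target_year}")
# 45 * 1.6^5 ≈ $472B, i.e. in the ballpark of the $500B+ figure cited.
```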