Anush Elangovan, AMD’s Corporate Vice President, highlights AMD’s commitment to empowering developers through the open and accessible ROCm software ecosystem, which supports diverse AI workloads across a wide range of hardware with a focus on performance, inclusivity, and ease of use. He emphasizes transparency, community collaboration, and streamlined user experiences, showcasing real-world demos and inviting developers to innovate freely within a robust, scalable AI platform.
In the video, Anush Elangovan, Corporate Vice President of AMD, addresses developers, creators, and innovators about AMD’s mission to empower developers through the ROCm software ecosystem. He highlights the broad deployment of AMD hardware, showcasing real-world applications such as Luma Labs’ Ray 3 running on MI325 GPUs, and emphasizes AMD’s commitment to supporting diverse AI workloads, including text-to-video, text-to-image, and large language models (LLMs). The core vision is to make AMD hardware and software accessible to everyone, removing barriers and vendor lock-in, and fostering an open, collaborative environment where developers can innovate freely.
Elangovan stresses the importance of transparency and community involvement in ROCm’s development. All ROCm code is publicly available, allowing external developers to file bugs and contribute just as internal AMD engineers do. This open approach accelerates innovation by leveraging collective effort rather than relying solely on AMD’s internal resources. He also highlights ongoing improvements across the ROCm stack aimed at speed, from installation to debugging and profiling, so that developers can move from idea to execution without being hindered by software limitations.
A significant focus is placed on the user experience and community enablement. ROCm now offers a streamlined installation process, comparable to a simple pip install on Windows or Linux, and an open build system called “rock” that mirrors AMD’s internal development workflow. This transparency and ease of use are designed to encourage community contributions and rapid iteration. AMD is also committed to delivering day-zero support and performance, ensuring that developers can rely on ROCm to be both functional and performant from the moment they start using it.
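Because ROCm builds of PyTorch expose AMD GPUs through the same `torch.cuda` API used for NVIDIA hardware, a pip-installed setup needs no vendor-specific code path. As a minimal sketch (not shown in the video; the helper name `pick_device` is illustrative), device selection can fall back gracefully whether or not a GPU or even PyTorch is present:

```python
# Hedged sketch: select a compute device for a ROCm-enabled PyTorch install.
# On ROCm builds, AMD (HIP) devices surface through the torch.cuda API,
# so the same check covers both vendors; everything else falls back to CPU.
import importlib.util

def pick_device() -> str:
    if importlib.util.find_spec("torch") is None:
        return "cpu"  # PyTorch not installed at all
    import torch
    return "cuda" if torch.cuda.is_available() else "cpu"

if __name__ == "__main__":
    print(f"Running on: {pick_device()}")
```

The same script then runs unchanged on a Ryzen laptop, an Instinct node, or a cloud instance, which is the portability point the talk makes.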
Elangovan outlines several key pillars guiding ROCm’s development: performance, robustness, inclusivity, and accessibility. ROCm is deployed at scale in seven of the top ten frontier labs, demonstrating its capability to handle complex workloads. The software ecosystem is designed to be inclusive, allowing developers to innovate at their own pace, and accessible across a wide range of AMD hardware, from Ryzen laptops to discrete GPUs and Instinct products. This pervasive AI software layer ensures a seamless experience whether running locally or deploying on cloud platforms like AMD DevCloud or partner clouds.
To illustrate ROCm’s capabilities, Elangovan presents demos such as gpt-oss running on a Strix Halo laptop, delivering competitive performance from just a simple two-line command on Windows. Another demo showcases agentic coding running entirely locally on Ryzen processors, enabling developers to execute LLMs without internet access. He concludes by reaffirming AMD’s commitment to openness, collaboration, and continuous innovation, inviting the developer community to join in shaping the future of AI software with ROCm as a shared philosophy and platform.