While many tech giants are busy constructing walled gardens around their AI empires, AMD is betting on a different strategy: tearing the walls down. Anush Elangovan, VP of AI Software at AMD, believes that in the current era of AI transformation, the only sustainable competitive advantage is not exclusivity, but velocity.
Elangovan frames this moment not as a mere software update, but as a fundamental shift comparable to the industrial revolution. "It's transformational at how you generate electricity. It's transformational at how you build data centers... It's transformational at what kind of use cases you use AI for," he observes.
This perspective shapes AMD's strategy. They view AI infrastructure not as a proprietary product to be hoarded, but as a foundational utility—like electricity—that requires broad, open collaboration to scale. This article explores Elangovan's philosophy on why "speed" is the ultimate moat, how open ecosystems create a self-reinforcing flywheel, and why interoperability is the key to unlocking the next generation of AI.
Speed as a moat
In the hardware world, "speed" usually refers to benchmarks and clock cycles. However, AMD has redefined this concept. They identify their true moat not as raw computational throughput, but as the organizational agility to innovate continuously.
"When I say speed is the moat, it is about how we prepare, how we build the muscle to run the race for a long time and run it fast," Elangovan explains. This definition expands the metric of success from a single product launch to the entire innovation lifecycle. It is about the ability to consistently deliver both hardware and software advancements at scale, anticipating where the industry is heading rather than just meeting current requirements.
To build this muscle, organizations must shed traditional assumptions. As Elangovan puts it, the goal is to "shed all things that hold you back and run as fast as you can." In an AI landscape defined by rapid, unpredictable change, the ability to adapt quickly is far more valuable than the ability to lock customers into a static ecosystem.
The innovation flywheel
AMD’s commitment to openness is not just philosophical; it is a practical mechanism for acceleration. By embracing open source, they create a self-reinforcing "innovation flywheel" where the pace of development outstrips what any closed system can achieve.
Elangovan compares this collaborative dynamic to the scientific community. "The reason that you publish scientific literature is that it moves the industry forward," he notes. One researcher cites another, experiments are replicated and refined, and collectively, "we move the ball forward like two millimeters."
In a closed system, progress relies on a single team. In an open ecosystem, that effort is multiplied, with numerous contributors pushing in the same direction to create momentum that benefits every participant. This allows frameworks like vLLM and SGLang to develop features at a breakneck pace, responding to industry shifts faster than any proprietary alternative could hope to match.
Acceleration through open source
This philosophy is embodied in ROCm, AMD's open-source AI software stack. The goal is to solve the most painful friction point for enterprise AI: the difficulty of moving workloads between hardware vendors.
The impact of this approach is visible in the ease of customer adoption. Elangovan notes that initial skepticism often gives way to surprise when customers realize they can migrate complex workloads without changing a single line of code. This level of compatibility turns what is traditionally a painful, resource-intensive migration into a seamless transition, removing the technical barriers that typically lock enterprises into a single vendor.
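This kind of drop-in portability can be illustrated with PyTorch: the ROCm build of PyTorch exposes the familiar `torch.cuda` interface (backed by HIP under the hood), so device-agnostic code written for NVIDIA GPUs runs unchanged on AMD hardware. A minimal sketch, with a purely illustrative model and batch shape (not from the article):

```python
import torch

# Device-agnostic selection: on a ROCm build of PyTorch, torch.cuda.*
# is backed by HIP, so "cuda" resolves to an AMD GPU when one is present.
device = "cuda" if torch.cuda.is_available() else "cpu"

# An illustrative model and batch -- nothing here is vendor-specific.
model = torch.nn.Linear(16, 4).to(device)
x = torch.randn(8, 16, device=device)
y = model(x)

print(tuple(y.shape))  # → (8, 4) whether the backend is NVIDIA, AMD, or CPU
```

The same pattern is what lets higher-level frameworks target multiple backends from one codebase: the vendor-specific plumbing lives below the framework API, so the application code never has to mention it.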
This zero-friction transition is the "killer feature" of open source. Furthermore, tools like the AMD Developer Cloud democratize access, ensuring that innovators without direct hardware access can still contribute to the platform. This supports what Elangovan calls "the last mile of AI"—ensuring that powerful solutions are actually accessible to solve everyday problems.
Interoperability over exclusivity
At the heart of AMD's strategy is a commitment to interoperability. Rather than creating proprietary systems designed to lock users in, AMD focuses on standardization that allows software to function seamlessly across different hardware implementations.
Elangovan draws a sharp contrast between systems designed for exclusive gain and those built for collective progress. While closed systems are often intentionally restricted to capture value, AMD chooses to share its work permissively, ensuring that the next generation of developers can build upon today’s foundations rather than reinventing them.
This creates an environment where innovation is driven by shared knowledge rather than artificial scarcity. It gives customers the flexibility to transition workloads without penalty, prioritizing performance over vendor loyalty. Elangovan’s thesis is simple but radical: "Let innovation flourish and let common knowledge, frontier knowledge be shared so that innovation moves forward for everyone."
The long game of open innovation
AMD is playing a long game. While proprietary walls may offer short-term profits, Anush Elangovan and his team are betting that the future of AI belongs to those who can innovate the fastest—and that the fastest way to innovate is together.
By redefining speed as an organizational muscle, leveraging the collective power of the open-source community, and prioritizing interoperability, AMD is building an engine for progress that is harder to replicate than any single piece of silicon. In the race to build the electricity of the 21st century, they are ensuring the grid remains open to everyone.
For the full story on how open ecosystems drive AI progress, listen to Anush Elangovan discuss these ideas in depth on the Dev Interrupted podcast.