AMD vs NVIDIA: The Underdog’s Comeback Story
For years, AMD has been chasing NVIDIA’s shadow — and always falling short.
Both companies make GPUs, yet when the world turned toward Artificial Intelligence, every major AI lab, startup, and research institute chose to side with Jensen Huang’s NVIDIA, leaving Lisa Su’s AMD almost invisible in the AI race.
But that’s starting to change.
Recently, OpenAI signed a massive partnership — not with NVIDIA — but with AMD.
Yes, that AMD. The one everyone said could never compete in the AI era.
This deal, reportedly worth tens of billions, could reshape the entire GPU landscape.
But before we talk about what changed, let’s rewind: why was AMD so far behind NVIDIA in the first place?
The Rise of CUDA — NVIDIA’s Secret Weapon
At first glance, both companies built GPUs. So why did NVIDIA dominate while AMD lagged?
In Q2 2025, NVIDIA owned 94% of the discrete graphics card market. AMD? A mere 6%, and that number was falling.
The reason? CUDA.
CUDA, short for Compute Unified Device Architecture, sounds technical, but its impact was revolutionary. Released in 2007, it let programmers use familiar languages, first C and C++ and later Python through libraries built on top of it, to tap NVIDIA's GPUs for non-graphics tasks. In simple terms, it made GPU computing developer-friendly.
It became the foundation for AI research, simulations, and machine learning frameworks.
CUDA wasn’t just software; it was an entire ecosystem. And once researchers and engineers adopted it, switching became nearly impossible. NVIDIA didn’t just sell hardware — it sold the standard.
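To make that concrete, here is a minimal, illustrative sketch of what "developer-friendly GPU computing" looks like in practice: a vector addition written in Python using the Numba library's CUDA bindings, one of many tools built on the CUDA ecosystem (Numba is our choice for illustration here, not something named in NVIDIA's pitch).

```python
# Minimal sketch: running a custom kernel on an NVIDIA GPU from Python.
# Requires an NVIDIA GPU, the CUDA toolkit, and `pip install numba numpy`.
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    i = cuda.grid(1)          # global index of this GPU thread
    if i < out.size:          # guard against out-of-range threads
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
vector_add[blocks, threads_per_block](a, b, out)  # Numba moves the arrays to and from the GPU

assert np.allclose(out, a + b)
```

A researcher can write something like this in an afternoon without ever touching a graphics API, and that convenience is exactly what pulled the AI community onto NVIDIA hardware.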
AMD’s Different Game — and Its Cost
While NVIDIA looked toward the distant horizon — artificial intelligence — AMD was busy fighting Intel in the CPU market and building chips for PlayStation and Xbox.
That’s where AMD won. But NVIDIA was playing a completely different game.
In its 2016 Shareholder Report, NVIDIA wrote:
“Deep learning breakthroughs have sparked the AI revolution. Progress is exponential. Adoption is exponential. The impact to the tech industry and society will also be exponential.”
That was years before ChatGPT, Claude, or Gemini even existed. NVIDIA had already seen the AI storm coming — and they built for it.
By the time AI took off, researchers didn’t just need GPUs — they needed NVIDIA GPUs.
AMD’s Growth and Sudden Collapse
AMD’s revenue shot up during the early AI boom — from $9.7 billion in 2020 to $23 billion in 2022. But then, everything began to unravel.
Profits plunged from $3.7 billion in 2021 to $1.2 billion in 2022, and then down again to $600 million in 2023.
The message was clear: AMD wasn’t the answer for AI.
A $49 Billion Gamble
In 2022, AMD made its boldest move yet — acquiring Xilinx for $49 billion in stock.
It was the largest chip deal in history. Xilinx specialized in FPGAs (field-programmable gate arrays) and adaptive computing, key technologies for data centers and cloud AI.
But there was a problem: investors weren’t thrilled.
AMD’s stock plummeted from $143 in late 2021 to $84 by mid-2022.
Even so, Lisa Su pushed forward.
Then, in October 2023, AMD bought Nod.ai, an open-source AI software company. It was clear: AMD was throwing everything at catching up with NVIDIA.
Yet by 2025, the gap only seemed to widen.
Enter the MI300X — AMD’s Counterattack
Launched in December 2023, the Instinct MI300X was AMD's most ambitious GPU yet.
Built specifically for AI workloads, it packed 5.3 TB/s of memory bandwidth, around 60% higher than NVIDIA’s H100.
It even delivered up to 40% lower latency on certain large language model (LLM) inference tasks.
Performance-wise, AMD was finally back in the fight.
Even more impressive? The cost-per-token — the real metric for AI workloads — was lower than NVIDIA’s, especially at large scales.
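To see why cost-per-token matters more than raw specs, here is a back-of-the-envelope sketch. The hourly rental prices and throughput figures below are invented placeholders, not measured numbers for the MI300X or the H100; the point is only how the metric is computed.

```python
# Back-of-the-envelope cost-per-token comparison with placeholder numbers.
def cost_per_million_tokens(price_per_hour: float, tokens_per_second: float) -> float:
    """Dollars spent to serve one million tokens at a given throughput."""
    tokens_per_hour = tokens_per_second * 3600
    return price_per_hour / tokens_per_hour * 1_000_000

# Hypothetical GPUs: a cheaper chip with slightly lower throughput
# can still win on cost per token at scale.
gpu_a = cost_per_million_tokens(price_per_hour=2.50, tokens_per_second=2400)
gpu_b = cost_per_million_tokens(price_per_hour=4.00, tokens_per_second=3000)
print(f"GPU A: ${gpu_a:.2f} per 1M tokens")   # ~ $0.29
print(f"GPU B: ${gpu_b:.2f} per 1M tokens")   # ~ $0.37
```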
That’s exactly what major players like OpenAI, Microsoft, and Meta care about.
For the first time, AMD wasn’t just competing on performance — it was competing on value.
But the Hype Fizzled
Analysts expected the MI300 series to bring in $8 billion, but estimates soon dropped to $5 billion.
When the real numbers came in, the entire Instinct line — not just the MI300X — made around $5 billion total.
Investors weren’t impressed.
As one analyst put it:
“AMD’s AI roadmap isn’t as competitive as we thought.”
Part of the problem was AMD’s software ecosystem — ROCm. It was open-source, flexible, and powerful — but it wasn’t CUDA. CUDA was simpler, better documented, and already everywhere. So even with better hardware, developers hesitated to switch.
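In practice, AMD's answer to that lock-in is compatibility at the framework level. As a rough illustration (assuming a ROCm or CUDA build of PyTorch is installed), the ROCm build of PyTorch reuses the familiar torch.cuda namespace, so much existing high-level code runs unchanged:

```python
# Sketch: the same PyTorch script can target either vendor's GPU.
# PyTorch's ROCm build answers torch.cuda calls, so code written for
# NVIDIA hardware often runs as-is on AMD Instinct GPUs.
import torch

if torch.cuda.is_available():                  # True on ROCm builds too
    backend = "ROCm" if torch.version.hip else "CUDA"
    device = torch.device("cuda")
    print(f"Running on {backend}: {torch.cuda.get_device_name(0)}")
else:
    device = torch.device("cpu")
    print("No GPU backend found, falling back to CPU")

x = torch.randn(4096, 4096, device=device)
y = x @ x                                      # same code path on MI300X or H100
```

The catch is that this only covers code written against high-level frameworks; projects built on hand-tuned CUDA kernels still face a real porting effort, which is where the hesitation came from.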
A Market on the Brink
By 2024, AMD was desperate to win back confidence.
NVIDIA’s near-total dominance (94% market share) created a different kind of problem: supply constraints.
Even NVIDIA's earlier Hopper-generation chips, the H100 and H200, were nearly impossible to get.
This opened a tiny window for AMD.
Cloud provider Vultr announced it would offer AMD Instinct MI300X GPUs on its platform.
It wasn’t just a sale — it was an opportunity for developers to finally try AMD’s architecture at scale.
And then, AMD dropped a bombshell.
The Turning Point: OpenAI
On June 12, 2025, at AMD’s “Advancing AI” event, Lisa Su revealed the next-gen Instinct MI350 series, and teased the MI400/MI450 generation.
And then came the real surprise — OpenAI was helping design it.
AMD’s Forrest Norrod said:
“OpenAI has given us a lot of feedback that heavily informed our design.”
Just months later, the announcement shook the tech world:
OpenAI’s next-generation AI infrastructure will run on AMD Instinct GPUs.
The deal, covering six gigawatts of Instinct GPUs and starting with the MI450 in the second half of 2026, could be worth up to $100 billion over time.
For OpenAI — the crown jewel of the AI industry — to partner with AMD instead of NVIDIA was more than just business.
It was a signal.
AMD’s stock exploded, rising over 70% within days.
The Catch
As part of the agreement, OpenAI received a warrant for up to 160 million AMD shares, roughly 10% of the company, vesting in tranches tied to milestones such as GPU deployment volumes and AMD's share price.
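The "around 10%" figure is simple arithmetic. The sketch below assumes roughly 1.6 billion AMD shares outstanding, a ballpark estimate that is not stated in the announcement itself.

```python
# Rough math behind the "around 10%" claim. The share count is an
# assumption (AMD had on the order of 1.6 billion shares outstanding
# in 2025), not a figure from the announcement.
warrant_shares = 160_000_000
shares_outstanding = 1_600_000_000   # assumed, approximate

stake_of_existing = warrant_shares / shares_outstanding
stake_after_dilution = warrant_shares / (shares_outstanding + warrant_shares)

print(f"Versus existing shares:  {stake_of_existing:.1%}")     # ~10.0%
print(f"After full dilution:     {stake_after_dilution:.1%}")  # ~9.1%
```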
It’s a bold, risky move.
Even NVIDIA’s Jensen Huang couldn’t resist commenting:
“I’m surprised they’d give away 10% of the company before even building the product. But it’s clever, I guess.”
He’s not wrong. If AMD fails to deliver, the cost could be catastrophic.
But if they succeed — this deal could redefine the future of AI hardware.
The Road Ahead
AMD remains the underdog.
Their MI300X has proven its power, and the OpenAI partnership gives them legitimacy.
But ROCm, their software ecosystem, still lags behind CUDA — the real moat NVIDIA built nearly two decades ago.
For AMD to truly compete, it’s not just about making faster chips.
It’s about building a developer-first ecosystem that makes AI as accessible on AMD as it is on NVIDIA.
The clock is ticking.
The future of AI hardware might not be decided by performance — but by accessibility, scale, and trust.
For now, AMD has taken the first real shot at breaking NVIDIA’s monopoly.
Whether it’s the start of a new era — or just another false dawn — only time will tell.
If you enjoyed this deep dive, follow StoryAntra for more untold stories of innovation, rivalry, and the people shaping our digital future.