Quick Read
AMD (AMD) expects the server CPU TAM to exceed $120 billion by 2030 as the CPU-to-GPU ratio in data centers shifts from 1:8 toward 1:1. EPYC server CPU revenue is up 50% year over year, with AMD targeting more than 50% market share, while Data Center revenue hit $5.77 billion, up 57% YoY, and Meta committed to 6 gigawatts of MI450 deployment. NVIDIA (NVDA) generated $62.31 billion in Data Center segment revenue in Q4, trades at a 169x forward P/E, and booked $10.98 billion in networking revenue last quarter. Intel (INTC) grew Data Center and AI revenue 22% to $5.05 billion, trailing AMD’s faster growth rate.
As AI workloads shift from training-heavy (GPU-dominant) to inference-heavy with agentic systems (CPU-balanced), AMD captures market share gains against Intel in CPUs while Instinct chips compete for GPU workload share against NVIDIA’s entrenched CUDA software moat.
The analyst who called NVIDIA in 2010 just named his top 10 stocks and AMD wasn’t one of them. Get them here FREE.
Lisa Su’s pitch on the Q1 call had a specific shape. She’s arguing that the ratio of CPUs to GPUs inside an AI data center is about to collapse. “…in the past, the CPU to GPU ratio was primarily just as a host node in like a 1:4 or 1:8 configuration node, now changing and getting closer to a 1:1 configuration or even.” That’s the spine of the bull case for Advanced Micro Devices (NASDAQ:AMD), and it reframes how investors should think about the AI buildout. For two years, the AI trade has been almost entirely a GPU trade, and almost entirely an NVIDIA trade. Su’s argument is that the next phase looks different: heavier on inference, heavier on agentic workflows, and therefore heavier on the CPU side of the rack where AMD already has a structural lead.
The ratio math, walked through slowly
Training a frontier model is GPU-heavy. Running inference at scale, with agentic systems spawning sub-tasks, is much more CPU-balanced. Su now expects the server CPU TAM to grow at greater than 35% annually, reaching over $120 billion by 2030, roughly double her November guidance. AMD already owns the share-gain story against Intel (NASDAQ:INTC) in EPYC, with server CPU revenue up more than 50% YoY and Su targeting greater than 50% share of that market. Intel’s own Q1 numbers underscore the dynamic: Data Center and AI revenue grew 22% to $5.05 billion, respectable but still trailing AMD’s $5.77 billion Data Center print, which itself grew 57% year over year. The gap between those two growth rates is the clearest evidence that EPYC’s share gains are still compounding.
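As a rough sanity check on the compounding, a 2030 target above $120 billion at 35% annual growth implies a server CPU TAM of roughly $27 billion today. The base-year figure below is backed out from the targets in the article, not stated by AMD; the five-year window and flat growth rate are simplifying assumptions:

```python
# Back out the implied current server CPU TAM from Su's 2030 target.
# Assumes 5 compounding years (2025 -> 2030) at a flat 35% per year;
# both the year count and the constant rate are simplifying assumptions.
growth = 0.35
years = 5
tam_2030 = 120.0  # $ billions, per the Q1 call

implied_base = tam_2030 / (1 + growth) ** years
print(f"Implied starting TAM: ${implied_base:.1f}B")  # ≈ $26.8B
```

The point of the arithmetic is that the target is internally consistent: a high-$20s billion market today, compounding at the rate Su cited, lands near $120 billion by the end of the decade.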
Layer the Instinct GPU business on top. Data Center revenue hit $5.77 billion, up 57%, with Q2 guidance of approximately $11.2 billion in total revenue, roughly 46% YoY growth. Meta committed to up to 6 gigawatts of AMD Instinct GPU deployment, with the first gigawatt running custom MI450 silicon. Su now sees a path to tens of billions of dollars in annual Data Center AI revenue in 2027 and an EPS run rate exceeding $20 over the strategic window. The full Q1 release is available in the company’s SEC filing. Free cash flow of $2.57 billion in the quarter, up 253% year over year, gives AMD the balance-sheet headroom to fund the MI450 ramp, the Samsung HBM4 collaboration for MI455X, and the Helios rack-scale program with Tata Consultancy Services without diluting shareholders.
Why this could close the gap with NVIDIA
NVIDIA (NASDAQ:NVDA) carries a market cap of $5.514 trillion, against AMD’s roughly $730 billion. NVIDIA’s Q4 Data Center segment alone did $62.31 billion, more than ten times AMD’s. The gap is enormous. But if Su’s ratio thesis is right, AMD harvests two compounding tailwinds. The CPU TAM doubles, and AMD’s share climbs past half. Meanwhile, inference economics let Instinct chip away at GPU share, workload by workload, particularly where ROCm has reached parity.
AMD shares are already up 109.32% year to date and 314.62% over one year, so the market has started pricing something in. The Q2 guide of roughly $11.2 billion at the midpoint implies 46% year-over-year growth, an acceleration from Q1’s 37.9%, and the kind of accelerating ramp that historically precedes multiple expansion rather than compression. Hyperscaler adoption is the other tell: AWS, Google Cloud, Microsoft Azure, and Tencent are all expanding 5th Gen EPYC instances, which is exactly the customer cohort that drove NVIDIA’s own data-center inflection in 2023.
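The acceleration claim is simple arithmetic on the guide. The ~$11.2 billion midpoint and the 46% growth rate back out the year-ago quarter, and comparing the two growth rates sizes the ramp; all inputs are from the article, while the year-ago base is derived rather than reported here:

```python
# Derive the implied year-ago Q2 revenue from AMD's guide and growth rate.
q2_guide_midpoint = 11.2   # $ billions, company guidance
q2_yoy_growth = 0.46       # ~46% YoY implied by the guide
q1_yoy_growth = 0.379      # Q1's reported 37.9% YoY

implied_year_ago_q2 = q2_guide_midpoint / (1 + q2_yoy_growth)
acceleration_pts = (q2_yoy_growth - q1_yoy_growth) * 100

print(f"Implied year-ago Q2: ${implied_year_ago_q2:.2f}B")   # ≈ $7.67B
print(f"Growth acceleration: {acceleration_pts:.1f} points")  # ≈ 8.1 points
```

Roughly eight points of year-over-year acceleration quarter to quarter is the pattern that tends to precede multiple expansion, which is why the guide matters more than the absolute number.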
What breaks the thesis
Two things. If inference stays GPU-dominated because CUDA’s software moat holds, the ratio never compresses and the CPU TAM expansion is smaller than Su projects. If NVIDIA’s Blackwell and Rubin cycles keep Instinct boxed into single-digit share of accelerators, the second leg of the trade never lands. Jensen Huang’s claim that “Blackwell sales are off the charts, and cloud GPUs are sold out” is not idle marketing.
NVIDIA’s FY2026 revenue of $215.94 billion and $96.58 billion in free cash flow give it the reinvestment firepower to keep widening the software and networking moat, especially with $10.98 billion in Data Center networking last quarter alone. AMD’s forward multiple is steep at 169x, so any execution slip on the MI450 ramp gets punished. Watch the Q2 server CPU result, which management expects to grow more than 70% YoY, and the MI450 shipment cadence in the second half. Those are the tells. If both come in at or above plan, the “AMD as big as NVIDIA” framing stops sounding rhetorical and starts sounding like a base case worth underwriting.