Investing.com — The rise of agentic AI is fundamentally restructuring data center architecture, moving beyond simple GPU-centric training toward complex workload orchestration. According to UBS analyst Timothy Arcuri, this shift is expected to catalyze a significant expansion in the server CPU Total Addressable Market (TAM).
The bank estimates the server CPU TAM could grow more than fivefold by 2030, rising from a $30 billion baseline in 2025 to approximately $170 billion. This trajectory is supported by expert calls indicating that agentic deployments generally require a 3x to 5x increase in CPU cores per user compared to traditional workloads.
While the expanding market is a “tide that lifts all boats,” the Arm Holdings ADR (NASDAQ:ARM) instruction set architecture appears poised to capture a disproportionate share of the new growth. Arcuri expects ARM’s unit share to reach 40% to 45% by 2030, up sharply from an estimated 15% in 2025.
This transition is driven by hyperscalers prioritizing power efficiency and high-density scaling for agentic “head nodes.” Consequently, UBS has raised its price target for ARM from $175 to $245, reflecting an increased long-term EPS CAGR assumption of 37%.
Advanced Micro Devices Inc (NASDAQ:AMD) is expected to benefit significantly from its established strength in high core counts and multithreading, which are critical for scaling parallel subagents. Arcuri suggests that while Intel Corporation (NASDAQ:INTC) aims to close the performance gap with its upcoming Diamond Rapids parts, its most immediate upside may reside in the “spillover” effect on the PC market.
Agentic AI tools are increasingly pushing workloads to run locally on end-user devices to utilize “free” compute capacity and reduce cloud latency. This trend is expected to catalyze a major PC upgrade cycle, providing a secondary growth engine for both x86 providers.
The technical requirements of agentic inference are driving a “step-function” increase in the attach rate of CPUs to accelerators. Expert insights gathered by UBS suggest that while traditional training requires 8 to 12 cores per GPU, agentic systems may require 80 to 120 cores per GPU.
This demand stems from the need for a “sandbox” environment for every individual subagent an AI spins up to complete a task. Arcuri observes that this complexity is forcing a shift from configurations of one CPU per four GPUs toward one CPU per two GPUs, or even 1-to-1 ratios, in the coming years.
The financial implications of this shift include a meaningful acceleration in CPU average selling prices (ASPs). High-end AI CPUs, such as NVIDIA’s 144-core Grace or AWS’s 192-core Graviton 5, are expected to command prices ranging from $3,000 to $4,000 per unit.