Nvidia remains the king of artificial intelligence (AI) infrastructure.

Broadcom has a huge opportunity with its custom, application-specific chips.

With HBM demand soaring, Micron is poised to be a huge AI infrastructure winner.


The biggest driving force in the market continues to be artificial intelligence (AI), which is why now can be a great time to pick up shares of some of the top companies in the space. Let’s look at three AI stocks to buy now.

The AI infrastructure boom shows no signs of slowing down, as evidenced by Taiwan Semiconductor Manufacturing's decision to greatly ramp up its capital expenditures this year. Fabs (chip manufacturing facilities) need to run near full capacity to be profitable, so this was not a step taken lightly. The foundry (known as TSMC for short) needed evidence that AI chip demand was not a passing fad, and it is now convinced that this demand is here to stay.

That's great news for Nvidia (NASDAQ: NVDA), which is a large TSMC customer. Its graphics processing units (GPUs) are the primary chips used to run AI workloads, and its main manufacturing partner just signaled that AI infrastructure demand is set to be a long-term secular trend. With roughly a 90% share of the GPU market, Nvidia remains one of the companies best positioned to benefit from ongoing AI data center construction.


While Nvidia’s dominance in AI infrastructure is unquestioned, Broadcom (NASDAQ: AVGO) is also making great strides by helping hyperscalers (owners of large data centers) make custom AI chips to handle some of their AI workloads. The company is a leader in ASICs (application-specific integrated circuits), providing the building blocks to help customers’ designs become physical chips that can be manufactured at scale.

ASICs are purpose-built chips that lack the flexibility of GPUs, but they tend to deliver strong performance for the specific tasks they're designed for while using less energy. That efficiency matters more and more for AI inference, which, unlike a one-time training run, is an ongoing cost incurred every time a model serves a request.

Broadcom helped Alphabet develop its highly successful tensor processing units (TPUs), and it's set to benefit from increased TPU deployments now that Alphabet is starting to let large customers order the custom chips for deployment through Google Cloud. Anthropic has already placed a $21 billion order with Broadcom for TPUs to be delivered this year.

Several other companies, including OpenAI, are working with Broadcom to design their own custom AI chips. Citigroup analysts have projected that Broadcom's AI revenue could increase fivefold over the next two years, from just over $20 billion this past fiscal year to $100 billion. That's huge given that the company recorded a bit under $64 billion in total revenue for fiscal 2025.
