The numbers simply don’t add up. And that is enough to warrant a temporary departure from the standard Neural Dispatch format this week. On November 20, Nvidia reported numbers for the third quarter of fiscal 2026. Record revenue of $57.0 billion, up 22% from Q2 and up 62% from a year ago, the tech giant says. “Blackwell sales are off the charts, and cloud GPUs are sold out. Compute demand keeps accelerating and compounding across training and inference — each growing exponentially. We’ve entered the virtuous cycle of AI. The AI ecosystem is scaling fast — with more new foundation model makers, more AI startups, across more industries, and in more countries. AI is going everywhere, doing everything, all at once,” says Jensen Huang, founder and CEO of NVIDIA. If you looked at just this summary (as most of you would, with limited attention spans), you’d be impressed.

Nvidia has almost $19.8 billion worth of unsold chips, and that’s up significantly in 3 months

But it was Wall Street that read between the lines, and the Nasdaq as well as the S&P 500 slid…and then kept sliding. A snapshot: the Dow was down 0.8%, the S&P 500 down 1.6%, and the Nasdaq down 2.2% on the day. There was much more to Nvidia’s earnings, which I’ll attempt to summarise to save you time (and to re-emphasise that calling this AI conversation a “bubble” wouldn’t exactly be out of place).

  • Nvidia has almost $19.8 billion worth of unsold chips (primarily GPUs) sitting in warehouses, and that number is up significantly in three months; the inventory figure was around $10 billion then. There is a reason why I put Huang’s entire quote in the previous paragraph. The official line is that demand is through the roof and supply is limited. But you know as well as anyone that shortage claims cannot be true if there’s so much inventory sitting around. It’s common sense. The only question to ask is: are customers simply not buying, or are they buying without actually having the money to pay? Nvidia, though, explains this as a build-up for future demand.
  • Nvidia has also reported $33.4 billion in unpaid bills, or accounts receivable as the terminology goes, and that number is up 89% in a year. This means customers who bought chips haven’t paid for them yet. The average payment window is now 53 days, up from 46 days a year prior. That extra week represents a few more billion dollars, and when they may arrive is anyone’s guess. A more concerning scenario for the chipmaker emerges if many of Nvidia’s customers start to behave like under-capitalised AI startups struggling to pay for hardware, should venture funding dry up or the AI bubble deflate before these companies achieve profitability.
  • Over the past month, we’ve seen absolutely brash attempts at a circular economy to keep the AI bubble pressurised. The latest chapter also featured Nvidia, which is participating with its own $2.5 billion ‘investment’ in Elon Musk’s xAI, which wants to raise $20 billion. From what I’ve read, this would involve an interesting financing structure: a special purpose vehicle (SPV) will purchase the Nvidia hardware, which xAI will then lease for a five-year term. This isn’t the first circular AI deal, and it won’t be the last.
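That 53-day payment window can be sanity-checked with the standard days sales outstanding (DSO) formula: accounts receivable divided by revenue for the period, times the days in the period. A minimal sketch using the headline figures from the release (Nvidia’s own internal calculation may differ in detail):

```python
def days_sales_outstanding(receivables_bn: float, quarterly_revenue_bn: float,
                           days_in_quarter: int = 91) -> float:
    """DSO = (accounts receivable / revenue for the period) * days in the period."""
    return receivables_bn / quarterly_revenue_bn * days_in_quarter

# Headline Q3 FY2026 figures: $33.4B receivable against $57.0B quarterly revenue
dso = days_sales_outstanding(33.4, 57.0)
print(round(dso, 1))  # → 53.3, roughly the 53 days reported
```

Running the same formula against the year-ago window of 46 days shows how much extra revenue is now sitting as IOUs rather than cash.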

The bottom line is, the same pile of dollars is being circulated between different AI companies, each holding the other’s hand in the hope that the bubble isn’t discovered, and the money gets counted as revenue at each corporate stop this wad of cash makes on its journey.

American investor and hedge fund manager Michael Burry made a rather blunt post on X after Nvidia’s earnings release. He wrote, “The idea of a useful life of depreciation being longer because chips from more than 3-4 years ago are fully booked confuses physical utilisation with value creation. Just because something is used doesn’t mean it is profitable.” He pointed out that airlines keep old planes around and in service, which come in handy during the festive-period rush, but they are only marginally profitable. The reality is, Nvidia’s CFO had pushed back on the GPU accounting (which I’ve explained above), saying in a statement that the useful life of Nvidia’s GPUs is a significant total-cost-of-ownership advantage over rivals, and pointing to A100 GPUs shipped six years ago that are still being utilised at full capacity by customers. But it isn’t that simple. The A100s consume as much as 3x more power per unit of compute (measured in FLOPS, or floating-point operations per second) than the H100s that followed them. And the H100 is itself approximately 25x less power efficient than Blackwell-generation chips. A debate is raging: should depreciation be 3 years, 5 years, or 7 years? Compulsion more than choice?
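To see why the 3-versus-5-versus-7-year question matters so much, recall that straight-line depreciation spreads an asset’s cost evenly over its assumed useful life, so the life you pick directly sets the annual expense hitting the income statement. A toy sketch with a hypothetical $100 billion GPU fleet (the fleet size is purely illustrative, not a figure from any filing):

```python
def annual_straight_line(cost_bn: float, useful_life_years: int) -> float:
    """Straight-line depreciation: equal expense in each year of the useful life."""
    return cost_bn / useful_life_years

fleet_cost_bn = 100.0  # hypothetical fleet cost, for illustration only
for years in (3, 5, 7):
    expense = annual_straight_line(fleet_cost_bn, years)
    print(f"{years}-year life: ${expense:.1f}B depreciation per year")
# A 3-year life books ~$33.3B/year; stretching to 7 years books ~$14.3B/year.
# Same hardware, same cash out the door, very different reported profits.
```

This is Burry’s point in miniature: extending the assumed life flatters earnings today, regardless of whether a six-year-old A100 is actually creating value at current power prices.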

THINKING

“We’re doing a 500 megawatts, gigawatts…It’s going to cost eight bazillion trillion dollars.” – Elon Musk, at the U.S.-Saudi Investment Forum 2025

With Elon Musk, it is difficult to know whether he was genuinely confused or whether it was just an artificially induced fog. But this was Musk, introducing xAI’s planned 500 MW AI data centre partnership with Saudi Arabia. And of course, this is powered by Nvidia, which is why CEO Jensen Huang was almost sweating when he said “stop it” as Musk stumbled between megawatt and gigawatt. Not a casual occasion to stumble, for the man who many believe is the saviour of humanity (of course that mission will also be powered by Nvidia, but I digress).
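For the record, the gap Musk blurred is a factor of a thousand: 1 gigawatt is 1,000 megawatts, so the announced site is half a gigawatt, not “gigawatts”. A trivial check:

```python
MW_PER_GW = 1_000  # 1 gigawatt = 1,000 megawatts

announced_capacity_mw = 500  # the xAI-Saudi data centre figure Musk cited
print(announced_capacity_mw / MW_PER_GW)  # → 0.5 (gigawatts)
```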

The Context: That is the whole AI bubble, condensed into one beautifully unhinged exchange. They thought no one would notice in the cloud of big numbers and excitement. A CEO who’s raising tens of billions for compute doesn’t know (or pretends not to know) the difference between megawatts and gigawatts. Understandably, the CEO of the world’s most valuable semiconductor company visibly panics, because the quiet part, that no one really knows where this is going or how much it will cost, has just been said out loud at an event filled with sovereign wealth funds. Has “fake it till you make it” morphed into “build it till the grid collapses and hope the ROI eventually materialises” for the AI era?

A Reality Check: We find ourselves in a moment where AI companies are committing trillions of dollars to data centres without a clear business model beyond “AGI will pay us back at some point.” Power costs worldwide are going up, as is the demand for water. Something simply has to give, at some point. AI companies and startups are being funded with billions so they can buy GPUs that don’t exist yet, to train models nobody knows how to monetise, for customers who aren’t sure why they need them.

Musk’s quote isn’t a joke; it’s as close as we’ll ever get to an accidental confession from the AI bros. The AI boom today is powered by physics, marketing, and spreadsheets that print whatever number keeps the funding round alive. Nobody has any idea what true power requirements will be (that will, after all, depend on usage, and no one knows that either), how many chips are actually needed, or what the returns look like. And yet everyone keeps buying compute because everyone else is buying compute. This is how bubbles form: through collective delusion wrapped in technical jargon. And you can’t blame Jensen for sweating, because this is a market built on curating expectations, and the worst possible thing is someone admitting they… don’t know what they’re talking about.