Something is starting to feel off at the AI-energy nexus.

Certainly, any tech market in the midst of exponential ascendancy will experience growing pains. Most previous tech cycles have had their booms, where good ideas are met with overinvestment, too-high valuations, and calls for a light regulatory touch and "strategic priority" designation from the government.

But as we’ve arrived at the third anniversary of ChatGPT and the corresponding launch of the AI era, this is starting to look like something different — and it’s worth considering if it’s going as well as it should. The implications for our energy system are profound and, as we’ve noted here on many occasions, it’s not yet established whether the assumptions that this market’s forecasts are predicated on will hold.

Let’s look at the five pillars of AI’s case for massive investment and government support:

AI, a new general-purpose technology based on large language models, will underpin the next era of the digital economy, adding intelligence to digital work, and accelerating productivity across the economy. 

LLMs are the basis for artificial general intelligence, an ultimate expression of AI, and they improve by scaling compute — which requires scaling energy infrastructure.

Gains in efficiency beget only more gains in adoption, so demand for compute and power remains on an upward trajectory.

We are in a race with China, so time spent deliberating regulations, environmental permits, or waiting in interconnection queues is time lost to an adversary. 

There won’t be one winner of the AI race; the technology will be fundamental to every major tech company’s strategy. 

Together, these have led all the major tech players and a range of “neo-cloud” companies to invest in data center development at an unprecedented pace. This started as a digital infrastructure story, but quickly became the dominant energy infrastructure story of the moment. The electricity sector is now firmly yoked to the ambitions of the tech sector, though they remain strangers to each other.

AGI strains credulity

This notion that the current crop of LLM-based AIs is on a straight path to AGI and will get there by scaling compute has always been debated, but this week, it feels particularly wobbly. 

Gary Marcus, who is the human embodiment of LLM skepticism, took to his Substack to mark the third anniversary of ChatGPT with a pretty scathing post about its potential, or lack thereof, to ever achieve AGI. These arguments feel academic much of the time, but as the U.S. market continues to digest forecasts of over 100 GW of data center load growth, the foundational belief around AI is incredibly important to stress-test every so often.

As we’ve mentioned before, Azeem Azhar’s dashboard for monitoring the state of the AI market has been a good guide over the past month to the question of whether we’re in an AI bubble. With gauges like industry strain, revenue growth, and valuation heat, the dashboard gives us a sense each day of how confident we should feel about the state of the AI market. Lately, the gauges are moving in the wrong direction, with the news around complex debt financing, DeepSeek’s latest model, and cybersecurity risks driving more indicators into the yellow or red.

When it comes to the question of whether LLMs are the right tool for the job, the evolving criticisms of LLMs, scaling laws, and forecast productivity gains could be aggregated into a new gauge that measures “credulity strain.” Gary Marcus clearly puts that one in the red, believing the current crop of models can’t really achieve AGI because they don’t have a “causal model of the world” from which to reason — by which he means understanding the principles governing the information gathered in a way that goes well beyond pattern matching. This deficit, he argues, is inherent to LLMs, so scaling, training, and fine-tuning won’t fix it. If he’s right, the industry is recklessly building to serve the ambitions of a handful of tech entrepreneurs instead of taking the time to get it right.

This is big. If you don’t get the AI you hoped for, you can’t dramatically improve productivity in the economy, and ultimately, you don’t get a return on investment. If you don’t have returns, the market collapses, data center construction pauses, and here in the power sector, stranded assets and misallocated capital are everywhere.

Social licenses are easily revoked

Sticking with the gauge metaphor, of late, local news is pushing “social strain” well into the red. The promises of tech titans rarely go over as planned, and with AI often equated with job destruction, data centers are increasingly a hard sell in the communities where they are being built.

And if you’re a utility, grid operator, or IPP looking to capitalize on this new era of load growth, this matters hugely. Because data centers are now as much energy infrastructure as they are digital infrastructure, the path to sustained growth runs through townships, planning boards, and permitting authorities.

Inside Climate News had perhaps one of the best headlines of the year in “A new unifying issue: Just about everyone hates data centers.” (Wired comes in second with “The data center resistance has arrived”). It’s a sobering thing to hear when you spend most of your time in the energy market, where data centers represent an opportunity to invest in new clean energy technologies like enhanced geothermal, grid-enhancing technologies, long-duration energy storage, and VPPs. 

But the news coming out of many states hasn’t been looking good for the past few weeks:

The $7 billion-plus Stargate data center in Saline Township, Michigan, was met with opposition, and the state’s Attorney General has asked the utility commission to slow its approval of two contracts that would allow the utility DTE to provide power to the project.

In rural, and often red, Pennsylvania, Talen is facing significant opposition to its contracts with data centers, as residents worry about the impacts on electricity rates, the environment, and farmland.

The firm Data Center Watch found that at least 16 data center projects worth a combined $64 billion have faced delays or have been entirely blocked by local officials due to opposition.

In states like Georgia, Illinois, Tennessee, Missouri, and Idaho, officials are embracing the “pre-emptive data center moratorium” as a way to slow-roll or block new developments.

This isn’t just a U.S. problem. As data center developers look to Latin America, weighing locations such as Chile and Brazil, opposition has emerged as residents learn about relaxed environmental permitting and limited transparency on development impacts.

Having a social license to build and operate in a community is no small thing. Forecasts for load growth across the U.S. must factor in the ability of projects not just to get interconnected to the grid or obtain the supply of equipment and resources to build off-grid — but also to secure regulatory approvals and permitting. 

As long as electricity prices remain the new “price of eggs” and perceptions of job losses and environmental impacts endure, the backlash will continue, and executive orders won’t be sufficient to bypass every scenario. I’d put the needle of this gauge well into the yellow, threatening red.

The race with China

The race with China is the reason given for AI’s strategic importance to the U.S., the reason the technology is worthy of special regulatory treatment and massive investment support. But if you were to ask any typical U.S. citizen just what the stakes of the AI race are, I’d be surprised if they were thinking in geopolitical terms at all. 

On the ground, China is not far behind, and some consider this race a dead heat. DeepSeek continues to release models that rival those from OpenAI and Google, while others in China have focused on developing more specialized models that don’t require training on the most advanced GPUs from Nvidia.

The race, in this light, appears to be a posture more than a truly strategic imperative. I fear it will continue to force federal and state policymakers into difficult positions around supporting growth at all costs, when this should be a much more deliberative process. What is actually at stake? Has the Trump administration credibly articulated how this fits into other geopolitical priorities?

Any reporting of “losing” the race with China risks rattling the markets in the U.S. and destroying a great deal of market value by perception alone. News of China’s increasing pace in the race tends to weigh negatively on Azeem’s “industry strain” gauge, as it threatens the viability of our current AI leaders. 

Final thoughts

Most of the discourse over the past few months has been around AI as a bubble and the implications for investors and the economy if it were to burst. With trillions of planned investment, that’s a critical conversation to have. 

But something about the past couple of weeks has me taking another step back. Previous bubbles had in common a clearer sense of what was actually being invested in, even if the investment came too early or ran too hot relative to its long-term value.

AI feels different. I love using Gemini 3; I get the potential. But the extreme confidence in this current mode — building massive data centers that require cities’ worth of power to gain incremental improvements in a technology whose creators can’t even agree on whether scaling like this works — feels particularly of this moment in the U.S.: macho, indifferent to local concerns, abstract to a fault, hyper-competitive, and peculiarly mistrustful of institutions and tech titans while granting them carte blanche to build, build, build.

A version of this story was published in the AI-Energy Nexus newsletter on December 3. Subscribe to get pieces like this — plus expert analysis, original reporting, and curated resources — in your inbox every Wednesday.