The world’s richest man and the “Silicon Valley Iron Man,” Elon Musk, plans to take SpaceX public in 2026.
With an overall valuation of approximately $1.5 trillion (roughly RMB 10.6 trillion) and planned financing that will “significantly exceed $30 billion,” the offering, if realized, would surpass Saudi Aramco’s 2019 IPO ($29.4 billion) to become the world’s largest.
To reach the ambitious goal of a $1.5 trillion valuation, SpaceX cannot rely solely on the old dreams of Starlink and Starship in the “vast expanse of the universe.” Musk has duly produced a new dream: space computing power.
Others who share this dream include Jeff Bezos, Sam Altman, and Silicon Valley giants such as Google, NVIDIA, and Amazon. There is no way around it: infrastructure and electricity on Earth are simply too expensive.
On November 18, at the Baron Capital annual investor conference, Musk publicly elaborated on the concept of “space AI computing power” for the first time: “Within five years, running AI training and inference in space will become the most cost-effective solution.”
Musk pointed out that in Earth orbit there is a never-setting sun providing free electricity; that the cosmic vacuum is the ultimate radiator; and that the reusability of Starship will dramatically cut the cost of lifting material into orbit.
However, exploiting that space and energy demands further technological breakthroughs, including radiation protection and heat dissipation. Those breakthroughs carry their own costs, which is ironic, given that the whole point of sending computing power into orbit was to solve an economic problem.
A senior aerospace researcher joked: “Put users’ data out under the raging solar wind, and what you need is not a computer but a vivid imagination.”
01 The Core of the New Narrative: Space Computing Power
The “space data center” Musk proposes is not science fiction; it directly targets the two major bottlenecks of today’s terrestrial AI infrastructure: energy cost and cooling cost.
Space offers two environmental advantages that make it a leapfrog alternative: a never-setting sun and the cosmic vacuum. The former is nearly free energy; the latter is a natural “ultimate radiator.”
As AI models scale rapidly, energy has become the greatest pressure on the economics of computing power. Training a model like GPT-5 may cost hundreds of millions of dollars, with electricity accounting for a large share of that.

In contrast, low-Earth orbit offers space computing a cost floor that can never be reached on the ground.
First, sunlight is continuous: low-Earth orbit satellites circle the Earth about 16 times a day, and by distributing orbits and scheduling over inter-satellite links, tasks can be switched in real time to nodes on the sunlit side, achieving a nearly 24-hour energy supply.
Second, no land, power grid, or substations are needed; once space photovoltaics are deployed, the long-term marginal cost of electricity approaches zero.
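The orbit count above can be sanity-checked with Kepler’s third law. The exact figure depends on altitude; the 550 km circular orbit used below (a Starlink-like altitude, an assumption on our part) gives roughly 15 orbits per day, in line with the article’s “about 16” for lower orbits.

```python
import math

MU = 398_600.4418   # Earth's gravitational parameter, km^3 / s^2
R_EARTH = 6_371.0   # mean Earth radius, km

def orbits_per_day(altitude_km):
    """Orbits per day for a circular orbit at the given altitude,
    from Kepler's third law: T = 2*pi*sqrt(a^3 / mu)."""
    a = R_EARTH + altitude_km          # semi-major axis, km
    period_s = 2 * math.pi * math.sqrt(a**3 / MU)
    return 86_400 / period_s

print(f"{orbits_per_day(550):.1f} orbits/day")   # ~15.1 at 550 km
```

Lower orbits circle faster, which is why the per-day count creeps toward 16 as altitude drops.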
Energy and infrastructure become cheap in space, but heat dissipation still has to be solved. High-performance GPUs have extremely high heat density; in ground data centers, cooling consumes 30%-40% of the electricity.
In space, the vacuum allows no convection or conduction, so heat can be shed only by thermal radiation; for a satellite on its sun-facing side, that amounts to “barbecue mode.” Add cosmic rays and high-energy particles, and high-performance computing may be disrupted outright.
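The radiation-only constraint can be made concrete with a back-of-envelope Stefan-Boltzmann estimate. The radiator temperature, emissivity, and 1 MW module size below are illustrative assumptions, not figures from any actual design:

```python
# Radiator area needed to reject waste heat in vacuum, where thermal
# radiation is the only heat path (Stefan-Boltzmann law).
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(heat_w, t_radiator_k, t_sink_k=3.0, emissivity=0.9):
    """Area needed to radiate `heat_w` watts at radiator temperature
    `t_radiator_k` against a cold sink at `t_sink_k` (deep space, ~3 K)."""
    flux = emissivity * SIGMA * (t_radiator_k**4 - t_sink_k**4)  # W / m^2
    return heat_w / flux

# An assumed 1 MW compute module with radiators held near 320 K (47 C):
area = radiator_area_m2(1_000_000, 320)
print(f"{area:,.0f} m^2 of radiator per MW")   # on the order of 1,900 m^2
```

Roughly two thousand square meters of radiator per megawatt is why heat rejection, not power, is often cited as the binding constraint for orbital data centers.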
“The computing power chips you use on the ground will probably not work in space,” the aforementioned researcher emphasized.
Another key to supporting SpaceX’s space computing power story is the transportation cost of Starship.
The economic viability of space computing power rests on Starship’s extremely low launch cost. Musk’s planned Starship V3 (single payload of about 100 tons) could cut the freight rate to $200-300 per kilogram after high-frequency reuse, far below existing rockets.
By this estimate, the transportation cost of lofting a 200 MW-class orbital data center is about $5-7.5 billion, still far below the overall investment required to build an AI supercomputer of the same class on Earth ($15-25 billion).
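The $5-7.5 billion figure can be reproduced from the article’s own numbers. The 25,000-tonne total station mass is the value those numbers imply, not a stated specification:

```python
# Reproducing the article's transport-cost estimate for a 200 MW
# orbital data center from its own assumptions.
PAYLOAD_T = 100                  # tonnes per Starship V3 flight (article)
RATE_LOW, RATE_HIGH = 200, 300   # freight rate, $/kg after reuse (article)

mass_t = 25_000                  # assumed station mass implied by the article, tonnes
cost_low = mass_t * 1000 * RATE_LOW
cost_high = mass_t * 1000 * RATE_HIGH
flights = mass_t / PAYLOAD_T

print(f"${cost_low/1e9:.1f}B - ${cost_high/1e9:.1f}B over {flights:.0f} flights")
# $5.0B - $7.5B over 250 flights
```

Note what the estimate leaves out: the cost of the hardware itself, on-orbit assembly, and replacement launches after failures, which is where the “economic problem” resurfaces.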
The story is always beautiful, but the reality presents many challenges.
Beyond the heat dissipation and cosmic rays discussed above, the first problem proponents of the “space computing power” narrative must reckon with is rocket explosions and accidents: the cost of such low-probability events would far exceed the value gained from improved launch capacity. After all, 1 GW of computing power on Earth costs up to $50 billion.
From this perspective, everything comes back to the “economic problem.”
On heat dissipation, the cold of deep space might help, but that raises a new problem: at such distances, solar power would have to be replaced by a denser energy source, such as nuclear.
As for radiation, it can be addressed through radiation hardening, but that stacks additional engineering challenges and cost on top of conventional GPUs.
“The materials and processes for radiation hardening are very expensive, and hardening has an upper limit. No matter what, it cannot match the protection of Earth’s atmosphere,” the aforementioned researcher said.
To address this problem, SpaceX is leveraging Tesla’s experience with automotive-grade AI chips: a triple modular redundancy (TMR) architecture and real-time verification significantly improve radiation resistance.
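As a minimal sketch of the idea behind TMR (illustrative only, not Tesla’s or SpaceX’s actual implementation): the same computation runs on three redundant units, and a majority vote masks a radiation-induced fault in any single one.

```python
# Triple modular redundancy (TMR): majority-vote three redundant results
# so a single-event upset in one unit cannot corrupt the output.
from collections import Counter

def tmr_vote(a, b, c):
    """Return the majority value of three redundant results; raise if
    all three disagree (an uncorrectable triple fault)."""
    value, n = Counter([a, b, c]).most_common(1)[0]
    if n < 2:
        raise RuntimeError("no majority: triple fault")
    return value

# One unit suffers a bit flip (43); the other two agree, so the vote masks it.
print(tmr_vote(42, 42, 43))   # -> 42
```

The price is threefold silicon and energy per computation, which is exactly the kind of overhead the researcher quoted above is pointing at.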
02 Silicon Valley Capital Backers “Get on Board”
SpaceX’s new narrative of space computing power is receiving a strong response from Wall Street and Silicon Valley capital.
In a recent research report, investment bank Morgan Stanley emphasized that the market’s repricing of SpaceX essentially stems from the further expansion of its business boundaries; the “orbital data center” has become the new AI-infrastructure narrative driving its valuation upward.

Analysts pointed out that this concept explains not only why the market is willing to value SpaceX like a high-growth software company, but also why its valuation has doubled in half a year.
Wall Street has responded warmly to the new story Musk has “created” for SpaceX. The most representative example is Ark Investment, led by Cathie Wood.
In its latest model, Ark values SpaceX entirely as a high-growth software and AI-infrastructure company rather than a traditional aerospace or telecommunications enterprise; nearly all of the newly added valuation comes from the “space AI computing power” business line.
As a long-time backer of Musk, Ark makes an aggressive scenario assumption: by 2030, Starlink will reach 120 million end users, contributing $300 billion in annual revenue, while orbital data centers add another $80-120 billion at a net margin above 70%, far higher than ground-based cloud services.
Ark believes the key is driving Starship’s launch cost below $100 per kilogram; after that, large-scale deployment of space computing power would grow exponentially, much as Amazon’s AWS did after its launch, propelling SpaceX toward a $2.5 trillion enterprise.
On the other hand, Peter Thiel, a Silicon Valley tycoon, has a more strategic influence on SpaceX.
As one of the earliest and most crucial external investors, Thiel “came to the rescue” in 2008, investing $20 million through Founders Fund when SpaceX, after three consecutive failed launches, was on the verge of bankruptcy. He has since invested in SpaceX several more times.
Thiel’s support goes far beyond money: it carries the authority of Silicon Valley’s ideological establishment, and his involvement amounts to an endorsement of the “space computing power” narrative. More importantly, Thiel uses his Silicon Valley-Washington influence to secure key policy and regulatory room for SpaceX.
The backing of these capital tycoons effectively stamps a “credible” label on SpaceX’s $1.5 trillion valuation. Hence Morgan Stanley’s point in the aforementioned report: although Musk denied the rumored $800 billion valuation, he was denying the financing activity rather than the valuation itself.
The progress of Starship and Starlink, the acquisition of global direct-to-cell spectrum, the emergence of the orbital data center narrative, and SpaceX’s overwhelming 90% share of global launch mass have convinced the capital market that these variables will become the “main artery of future AI infrastructure.”
03 Competition, Bubbles, and Risks
Over the past few years, Musk has not been the only one promoting the development of space computing power.
As early as 2021, the Barcelona Supercomputing Center (BSC) in Europe and Airbus Defence and Space jointly launched the GPU4S (GPU for Space) project, funded by the European Space Agency (ESA), to verify the feasibility of embedded GPUs in aerospace applications.
The project released an open-source benchmark suite, GPU4S Bench, to evaluate performance on workloads such as image processing and autonomous navigation, and later produced OBPMark, another open-source benchmark suite adopted by ESA, laying the groundwork for Europe’s technological autonomy in orbital computing.
On November 2, 2025, SpaceX’s Falcon 9 launched Starcloud-1, the first test satellite of the startup Starcloud, into low-Earth orbit. The satellite carries NVIDIA’s H100 GPU, verifying in-orbit AI data processing for the first time. NVIDIA and Starcloud reportedly co-developed a vacuum heat-dissipation architecture that conducts heat through high-thermal-conductivity materials to the satellite’s exterior and sheds it as infrared radiation.
As Musk’s long-standing rival in the commercial space race, Jeff Bezos is also promoting Blue Origin’s development of orbital AI data center technology, planning to use space solar energy to power large-scale AI computing.
He predicts that within the next 20 years, the cost of orbital data centers may be lower than that of ground facilities.
Another competitor in the AI field, Sam Altman, the CEO of OpenAI, is also eager to try. He is considering acquiring the rocket company Stoke Space to send AI computing payloads into space.
It can be said that there are many challengers on SpaceX’s “space computing power” path to a $1.5 trillion valuation, but the most direct competitor at this stage is Project Kuiper under Amazon.
Public information shows that Kuiper plans to deploy 3,200 satellites between 2026 and 2029 through a launch agreement with Blue Origin. Its greatest advantage lies in AWS’s global cloud ecosystem, which can provide enterprises with a “ground + space” hybrid computing power product.
But in Thiel’s view, Kuiper remains an “extension of the traditional cloud,” still premised on Earth’s infrastructure, whereas SpaceX’s space AI is a wholly new paradigm: moving the computing center itself into orbit and relegating Earth’s data centers to a supplementary layer. That paradigm difference sets the ceiling for both sides in the coming “orbital cloud” competition.
Beyond the challenges of technology and competition, regulation cannot be ignored: orbital debris management, international spectrum coordination, and disputes over space militarization may all affect SpaceX’s pace over the next two or three years.
Looking back at SpaceX: from the “Starlink IPO” of 2019, to “Starship is the core asset” in 2022, to today’s “the space data center will become the cheapest AI computing power,” Musk has rewritten SpaceX’s narrative three times in six years, and almost every time what was once questioned has become a real-world prototype.
Now he has bundled Tesla’s chip capability, xAI’s models, Starlink’s bandwidth, and Starship’s lift capacity into a unified strategy, aimed squarely at the most expensive resource of the AI era: low-cost computing power.
Wall Street has started to place bets – the question is, can the richest man’s new version of the “space dream” come true again?
This article is from the WeChat official account “Tencent Technology,” written by Su Yang and Wu Ji, and published by 36Kr with authorization.