Data centers built to deliver artificial intelligence (AI) services are springing up as fast as Wall Street can finance them. Meanwhile, these sprawling digital workhorses are inhaling electricity (putting pressure on already stretched grids) and consuming massive amounts of water for cooling. Earthly limits loom.
Not so in space: Up there, data centers can feed on continuous solar power and simply radiate heat into the void; furthermore, orbital centers could allow high-performance computers to sit right next to the satellites that collect the raw information, helping reduce data-processing time. Hence Elon Musk’s decision in early February to combine SpaceX, his spaceflight company, with xAI, his AI startup—a tie-up, Musk says, primarily driven by the growing pressure to put data centers in space.1
In our view, space may well be the next AI frontier (if not the final one). Here’s a quick look at the burgeoning orbital-AI sector, including key drivers, players and investment risks.
The Next Frontier
Orbital AI aims to solve a few pressing structural needs, starting with the relentless demand for data-processing speed.
Modern spacecraft now collect torrents of data—radar, hyperspectral, optical—but sending all that raw information back to Earth has become a significant choke point. Orbital data centers aim to relieve that pressure by co-locating data collection and computation: Letting the “eyes” sit next to the “brains” allows data to be filtered, compressed and interpreted before it reaches the ground, saving precious seconds when tracking troop movements, flood lines and supply-chain disruptions.
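The filter-before-downlink idea can be sketched in a few lines. The scoring rule, threshold and data layout below are purely illustrative assumptions, not any vendor's actual pipeline:

```python
# Toy sketch of onboard "edge" filtering: score each captured frame and
# downlink only the ones worth ground analysts' time. The scoring rule
# and threshold are illustrative assumptions, not a real pipeline.

def score_frame(frame: dict) -> float:
    # Hypothetical interest score: fraction of pixels that changed since
    # the last pass (a simple stand-in for a trained AI model).
    return frame["changed_pixels"] / frame["total_pixels"]

def select_for_downlink(frames: list[dict], threshold: float = 0.10) -> list[dict]:
    # Keep only frames whose change score clears the threshold,
    # shrinking the raw stream before it ever hits the radio link.
    return [f for f in frames if score_frame(f) >= threshold]

# Example: four captured frames; only the "interesting" ones get sent down.
captured = [
    {"id": 1, "changed_pixels": 500,     "total_pixels": 1_000_000},  # quiet scene
    {"id": 2, "changed_pixels": 250_000, "total_pixels": 1_000_000},  # flood line moved
    {"id": 3, "changed_pixels": 1_200,   "total_pixels": 1_000_000},  # quiet scene
    {"id": 4, "changed_pixels": 130_000, "total_pixels": 1_000_000},  # convoy spotted
]
downlink = select_for_downlink(captured)
print([f["id"] for f in downlink])  # only the high-change frames: [2, 4]
```

In this sketch, half the frames never leave the spacecraft, which is the whole point: compute sits next to collection, and only interpreted results consume downlink bandwidth.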
The need for speed is one driver. Easing pressure on our planet’s increasingly taxed power and water supplies is another:
- Power: In the U.S. alone, terrestrial data centers (not including cryptocurrency mining) accounted for approximately 4.4% of total electricity consumption in 2023; some projections have that figure climbing to 12% by 2028 as AI demand continues to escalate.2
- Water: Cooling a typical data center requires 300,000 gallons of water per day (about 1,000 households' worth),3 while large centers may need 5 million gallons, roughly the daily usage of a 50,000-person town.4 Those demands will only rise as more centers come online.
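The water comparisons above can be sanity-checked with back-of-envelope arithmetic. The per-household and per-person usage rates below are rough assumptions on our part (not figures from the cited sources):

```python
# Back-of-envelope check of the water comparisons above.
# Assumed daily usage rates (rough U.S. averages, not from the cited sources):
GALLONS_PER_HOUSEHOLD_PER_DAY = 300
GALLONS_PER_PERSON_PER_DAY = 100

typical_center = 300_000    # gallons/day, typical data center
large_center = 5_000_000    # gallons/day, large data center

households_equiv = typical_center / GALLONS_PER_HOUSEHOLD_PER_DAY
town_equiv = large_center / GALLONS_PER_PERSON_PER_DAY

print(f"Typical center: about {households_equiv:,.0f} households' daily usage")
print(f"Large center: about a {town_equiv:,.0f}-person town's daily usage")
```

At those assumed rates, the arithmetic lands on the same round numbers the text quotes: roughly 1,000 households and a 50,000-person town.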
We believe orbital systems have the potential to address both of these challenges by tapping near-continuous solar power and radiating heat into space; even a single 40-megawatt cluster in orbit (roughly the continuous power draw of a small city) could be meaningfully cheaper to run over a decade than an equivalent facility on Earth.5
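To put the 40-megawatt figure in perspective: megawatts measure instantaneous power, so multiplying by hours gives energy. The average-household draw used below is a rough assumption of ours, not a sourced figure:

```python
# Power vs. energy for the 40 MW orbital cluster mentioned above.
# The ~1.2 kW average-household draw is a rough assumption, not sourced.
cluster_mw = 40
avg_household_kw = 1.2

# Continuous draw: how many average homes could 40 MW supply at once?
homes_supplied = cluster_mw * 1_000 / avg_household_kw

# Energy over one day: megawatt-hours consumed (or, in orbit, harvested).
mwh_per_day = cluster_mw * 24

print(f"About {homes_supplied:,.0f} average homes supplied continuously")
print(f"About {mwh_per_day} MWh of energy per day")
```

Under those assumptions, a 40 MW cluster runs on the order of 30,000-plus homes' worth of continuous power, and works through nearly 1,000 megawatt-hours of energy every day.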
Meet Some of the Players
Orbital data centers may be a small slice of the AI pie for now, yet we expect them to add a new layer of infrastructure spending to complement the buildout on the ground.
For investors, we believe the emerging orbital-AI stack breaks into three buckets: 1) launch specialists, 2) hardware and software makers, and 3) chipmakers.
Launch Specialists
These companies do the heavy lifting by hoisting data-center modules into space. Some launch specialists are also vertically integrated: Rocket Lab, for one, designs and produces satellites; builds key subsystems like solar panels and laser communication terminals; and plans to launch them into low Earth orbit. The company says its reusable Neutron rocket is designed to carry payloads of up to 13,000 kilograms.6
Software / Hardware Makers
These companies build and operate orbital data centers, applying trained AI models to live data streams so that only the most useful information gets transmitted. Starcloud, part of Nvidia’s Inception program, has already trained an AI model in space, moving the idea from concept to early pilot.7 Planet Labs, another key player in our view, is teaming with Google to develop orbital hardware that can deliver processed images at single-meter resolution.8
Chipmakers
These companies supply the computational horsepower, with Nvidia firmly at the head of the pack. The AI-chip juggernaut is already embedding its GPUs across various orbital pilot programs, and we believe any sustained shift toward orbital AI would only expand the company’s broader addressable market.
Reaching Escape Velocity
While the new space race is on, we believe the transition to orbital AI still faces some formidable challenges, both technical and regulatory.
On the technical side, start with the fact that radiation from the sun and deep space can scramble or degrade chips over time. “Hardening” orbital-AI systems—be it through shielding, circuit design or more robust materials—remains mission-critical, and the industry still needs to prove that high-end AI chips can operate reliably for years under these harsh conditions.
Thermal management is another engineering challenge: GPUs generate a lot of heat, yet satellites can move from sunlight to darkness in minutes. Ensuring long-term system performance means finding a way to stabilize temperatures and avoid sudden thermal swings.
Achieving scale is a third technical hurdle, given the sheer weight of hardware needed for optimal data-center “clustering”—that is, linking multiple computational nodes so they work together as a single unit. And finally, servicing all that hardware once it’s in orbit is not as easy as merely driving over to a center, rolling up one’s sleeves and getting to work.
On the regulatory front, we believe changes in spectrum policy could hamper orbital data-center deployment. Orbital centers will rely on radio frequencies and optical links to communicate with ground stations and other satellites, but those frequencies are regulated and often crowded. Securing and coordinating enough spectrum—and maintaining high-bandwidth optical links to major cloud-computing regions—is a non-trivial challenge, in our view.
Countdown to Launch
While the industry clearly has its work cut out—hardening GPUs against radiation, launching scalable (and serviceable) computing clusters, and securing reliable high-speed data links and solid partnerships with hyperscalers, among other challenges—we believe orbital AI has the potential to ease significant terrestrial bottlenecks, extend the AI infrastructure stack and give investors a fresh way to participate in the long-term AI theme.
The countdown to launch, in our view, has clearly begun.