Harrison Stoneham
Elon Wants to Put Data Centers in Space. The Argument Is Better Than You Think.

I was walking the dog and listening to the new Dwarkesh episode when Elon said something that stopped me in the driveway: “Mark my words, in 36 months, probably closer to 30 months, the most economically compelling place to put AI will be in space.”

I replayed it. He was serious.

My first reaction was the same as yours. But then I started thinking about why, and the argument isn’t really about space at all. It’s about electricity — where it comes from, how much we need, and the fact that we’re running out of places to get it.


Quick detour. We recently redid the electrical at our house and I was finally going to get an electric car. I was set on the Rivian. The investor day presentation was incredible, the tech looked genuinely impressive, and when I test drove it I was blown away by the build quality. Gorgeous car. They’d mapped 3.5 million miles of roadways for their hands-free driving system, which sounded like a lot.

I was ready to buy it. But the dealership said hey, just go test drive the Tesla before you decide.

I never touched the steering wheel. Not once. There were a couple moments where I genuinely wasn’t sure what the car was going to do — a weird intersection, a cyclist cutting across — and it just figured it out. During the drive I mentioned the Rivian’s 3.5 million mapped miles and the Tesla guy said, “Yeah, we’re at about eight billion.”

I did the math later. Rivian's 3.5 million mapped miles vs Tesla's eight billion driven miles. Mapped miles and driven miles aren't quite the same metric, but the scale is the point: that's roughly 2,300 times as much raw driving data. The gap isn't a gap, it's a chasm.
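The back-of-the-envelope version, using the two figures quoted above (and keeping in mind that mapped miles and driven miles measure different things, so treat this as order-of-magnitude):

```python
# Ballpark the data gap using the figures quoted in the conversation.
# Mapped miles and driven miles are different metrics, so this is an
# order-of-magnitude comparison, not apples-to-apples.
rivian_mapped_miles = 3.5e6  # Rivian: pre-mapped roadway miles
tesla_driven_miles = 8e9     # Tesla: fleet miles driven (approximate)

ratio = tesla_driven_miles / rivian_mapped_miles
print(f"roughly {ratio:,.0f}x")  # roughly 2,286x
```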

I bought the Tesla. Not because I’m an Elon fan. Because nothing else was even close, and the reason nothing else was close is that Tesla has been collecting driving data for years at a scale nobody else can touch. That kind of gap doesn’t close easily.


I tell that story because it’s the lens I was listening through when Elon started talking about space. Not “is this guy crazy” but “is there something real here that people are writing off too quickly?”

His timelines are always off. Full self-driving was supposed to be done in 2020. But the guy went to Russia to buy a rocket, got laughed out of the room, and decided to just build his own. Now SpaceX is the backbone of the US space program. He’s catching rockets out of the air with a giant mechanical arm. He built a global satellite internet constellation that works in the middle of the ocean.

I’d bet he’s directionally right about space compute, just early on the timeline — as usual. The question isn’t whether it happens in 30 months. It’s whether it happens at all. And the more I think about the energy math, the more I think it does.


Here’s the setup. Building AI at the scale these companies are talking about requires an insane amount of electricity. Elon threw out a number: a terawatt of compute capacity would need roughly twice what the entire US currently consumes. The whole country. Times two. Just for AI.
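The claim checks out roughly, with the caveat that the ~4,000 TWh/year figure for US electricity consumption is my own ballpark, not something from the episode:

```python
# Sanity check on "1 TW of compute is roughly twice US consumption."
# The ~4,000 TWh/year US consumption figure is my assumption.
US_ANNUAL_TWH = 4_000   # approx. annual US electricity consumption
HOURS_PER_YEAR = 8_760

avg_us_draw_tw = US_ANNUAL_TWH / HOURS_PER_YEAR  # TWh/yr over hrs/yr = TW
multiple = 1.0 / avg_us_draw_tw                  # 1 TW of AI vs. that average
print(f"US average draw ~{avg_us_draw_tw:.2f} TW; 1 TW is ~{multiple:.1f}x that")
```

In other words, a terawatt of always-on compute would be a bit more than double the country's average draw, which is the "times two" in Elon's framing.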

You can’t build that fast enough. Power plants take years. Permits take longer. And the actual bottleneck — this is the part that stuck with me — is turbine blades. Turbine blades and vanes, these specialized cast components inside gas power plants. You can’t 3D print them. You can’t scale the foundries overnight. They’re the physical chokepoint that determines how fast we can add power generation on Earth.

Elon’s line was something like, “Those who have lived in software land don’t realize they’re about to have a hard lesson in hardware.” As an investor who’s been building with AI tools lately, that one landed.

Everyone in tech assumes everything scales if you throw money at it. Hardware doesn’t work that way. It has supply chains with 18-month lead times. It has guys in foundries pouring metal into molds. Anyone who’s done diligence on a manufacturing business knows — that doesn’t speed up because someone wrote a bigger check.


So: space. In orbit you have essentially unlimited solar power. No weather. No clouds. No atmosphere to filter anything out. And the solar panels themselves can be cheaper and thinner than what you’d use on Earth — you don’t need glass, you don’t need weatherproofing, you don’t need to engineer for wind loads. Just a thin film between you and the sun.
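To put rough numbers on the orbital advantage (every figure here is my own ballpark assumption: the solar constant, a dawn-dusk orbit with near-continuous sunlight, and a typical capacity factor for utility-scale solar on the ground):

```python
# Per-square-meter solar comparison, orbit vs. ground.
# All numbers are ballpark assumptions, not from the episode.
SOLAR_CONSTANT = 1361          # W/m^2 above the atmosphere
ORBIT_ILLUMINATION = 0.99      # dawn-dusk sun-synchronous orbit: almost no eclipse
GROUND_PEAK = 1000             # W/m^2 at the surface, clear sky, noon
GROUND_CAPACITY_FACTOR = 0.20  # night, weather, and sun angle averaged in

orbit_avg = SOLAR_CONSTANT * ORBIT_ILLUMINATION    # ~1,347 W/m^2, around the clock
ground_avg = GROUND_PEAK * GROUND_CAPACITY_FACTOR  # ~200 W/m^2 averaged
print(f"~{orbit_avg / ground_avg:.0f}x more energy per panel area in orbit")
```

Under those assumptions a square meter of panel in the right orbit collects several times what it would on the ground, before you even count the cheaper, thinner panels.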

Dwarkesh pushed back and I thought his point was good — energy is only 10-15% of data center costs. The GPUs are the real expense. And you can’t send a technician to low Earth orbit when a rack goes down.

That last part is worth sitting with. What does replacing GPUs in orbit actually look like? Are you launching entire racks every time something fails? GPUs have maybe a 3-5 year useful life before the next generation makes them obsolete. That means you’re not just building data centers in space — you’re building a supply chain to space. Continuous launches, continuous hardware cycling, continuous maintenance at a distance where “send a tech” means “launch a rocket.” That’s a real problem, and I don’t think anyone has a clean answer for it yet.
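Just to see the shape of that supply-chain problem, here's a sketch in which every single number is an assumption of mine (cluster size, power per GPU, mass per GPU, Starship's advertised payload), chosen only to get a feel for the scale:

```python
# How many launches does hardware churn alone imply?
# Every number below is an assumption, chosen only to see the scale.
CLUSTER_POWER_GW = 1
WATTS_PER_GPU = 1_000          # GPU plus cooling/networking overhead
GPU_LIFETIME_YEARS = 4         # midpoint of the 3-5 year range above
GPU_MASS_KG = 40               # GPU plus its share of rack and structure
STARSHIP_PAYLOAD_KG = 100_000  # advertised LEO payload target

gpus = CLUSTER_POWER_GW * 1e9 / WATTS_PER_GPU  # ~1,000,000 GPUs
replaced_per_year = gpus / GPU_LIFETIME_YEARS  # ~250,000 GPUs/year
resupply_mass_kg = replaced_per_year * GPU_MASS_KG
launches_per_year = resupply_mass_kg / STARSHIP_PAYLOAD_KG
print(f"~{launches_per_year:.0f} launches/year just cycling hardware")
```

Tweak the assumptions however you like; the answer tends to stay in the "many launches per year, forever" regime, which is exactly the supply-chain-to-space problem.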

But Elon’s counter was the part I keep thinking about. He said the issue isn’t the cost of energy. It’s the availability. There are GPUs sitting in warehouses right now that can’t be deployed because there’s nowhere to plug them in. The power just doesn’t exist yet. It doesn’t matter how cheap energy is per kilowatt-hour if you literally cannot get enough kilowatt-hours.

That reframes the whole question. It’s not “is space cheaper?” It’s “is space the only place where the power actually exists at the scale we need?”


Now — this whole idea depends on one assumption: that AI compute demand keeps compounding at current rates. If efficiency improves faster than expected, or the models plateau, the electricity crisis might solve itself before anyone needs to launch a rack into orbit. Microsoft is betting on nuclear. Others are betting on fusion timelines shortening. There are paths through the energy wall that don’t involve rockets.

I think demand is more likely to outrun efficiency, but that’s a bet, not a certainty. And if there’s one thing I’ve learned in investing, it’s to be honest about which parts of your thinking are assumptions.


And this is where the SpaceX-xAI merger starts to make sense. It’s not just a corporate thing. He’s putting all the pieces under one roof.

AI has proven it can write software. It clearly cannot build hardware. xAI builds the models. SpaceX builds the rockets. Starship puts the compute in orbit. Starlink provides the communication layer between users on the ground and GPU clusters in space. The solar infrastructure powers the whole thing without any of the constraints that throttle terrestrial energy.

I’m not sure anyone else has all of those pieces. Governments could get there eventually. China will try. Microsoft is going hard on nuclear as a terrestrial alternative. But right now, nobody else has the rockets, the satellites, the AI models, and the launch infrastructure all in one place. And the hard part isn’t money — it’s the knowledge you can only get by doing it for twenty years.


I keep thinking about TSMC. Going from 28 nanometer chips to 5 nanometer chips wasn't a software update. It took decades of trial and error, process refinement, and accumulated expertise that can't be written down or copied. That's the real moat — not patents, not equipment, but knowledge that only exists inside the people and processes that built it. The US is spending billions trying to replicate it and finding out how hard that is.

Building rockets is arguably harder. The tacit knowledge required to reliably launch, land, and reuse orbital-class vehicles at scale — SpaceX has been building that for over twenty years. Blue Origin has been trying. They’re getting there. But the gap is enormous.

Jeff Bezos has been saying something similar for years, by the way. His whole idea with Blue Origin is that there’s no other viable place for humans to live besides Earth — “there’s no plan B” — so you have to get heavy manufacturing and energy production off the planet to preserve it. Earth as the residential zone, space as the industrial zone.

The interesting part isn’t that two billionaires agree. It’s that they got to the same place from totally different directions. Elon is thinking about where to put the computers. Bezos is thinking about how to keep the planet livable. And they both end up at: heavy industry moves off-planet, eventually, because there’s nowhere else for it to go.


I had an opportunity to invest in SpaceX a while back and passed. The valuation seemed crazy and I didn’t fully understand the business. I think about that one a lot. Not because I underestimated Elon — because I underestimated how much physical infrastructure would matter in a world that was supposedly going digital. I was thinking about the digital side. I should have been thinking about atoms.

The thing I took away from the episode isn’t really about space. It’s about the wall. We’ve been living in a world where the constraint on AI was algorithms, then data, then chips. And now it might just be electricity. The most boring, most physical, most un-software constraint imaginable.

We thought intelligence was abstract. Turns out it’s made of metal and heat. And someone has to build the power plants — or, apparently, the rockets.


Stood in the driveway for a minute after the episode ended, just thinking. The full Dwarkesh conversation with Elon is here. Worth the three hours.