ORBITAL DATA CENTRES FOR AI COMPUTING
Orbital data centres are satellite-based computing facilities designed to meet AI energy demands by leveraging space's abundant solar power and unique environmental conditions.
- Use solar energy in orbit for near-continuous power, reducing reliance on terrestrial grids and cooling resources.
- Employ radiative cooling methods due to the absence of air, requiring large radiator panels to dissipate heat.
- Utilize laser-based free-space optical links for high-speed satellite communication instead of physical cables.
- Rely on robotic assembly and maintenance with radiation-hardened hardware and software error correction for reliability.
Let’s start with an uncomfortable truth. The internet isn’t floating in a cloud. It’s sitting in very real buildings, chewing through electricity, water, and land at a rate that would make a small country blush.
As AI ramps up, those buildings are getting bigger, hotter, and thirstier. On some projections, data centres could be consuming close to 10% of the world’s electricity by 2030. Add in millions of gallons of water per day for cooling, and suddenly your “quick ChatGPT check” starts to feel like running a small industrial plant.
So naturally, the next logical step is… space.
Having spent over 30 years in digital marketing and watching infrastructure evolve from dial-up to AI-driven hyperscale, this feels like one of those inflection points where everything shifts at once.
Yes, really. Not a gimmick. Not a pitch deck fantasy. A serious, slightly mad, but increasingly practical idea.
From Server Farms to Satellite Clusters
The traditional data centre model is starting to creak. You need land, power, cooling, planning permission, and a tolerant local community that’s happy with a giant humming box next door.
Good luck with that.
Space flips the script.
No land constraints. No neighbours complaining about the noise. No water shortages. And best of all, a near-endless supply of solar energy. In certain orbits, satellites can soak up sunlight almost 24/7. No clouds, no night, no British drizzle ruining your output.
Projects like the European Commission’s ASCEND study and Google’s Project Suncatcher aren’t just tinkering either. They’re exploring gigawatt-scale compute in orbit by the mid-2030s.
Which sounds bold. Until you look at AI demand and realise it’s the only way the maths works.
The Real Driver: Energy (and a Bit of Panic)
Strip away the shiny tech and the real issue is simple. Energy.
Training large AI models is expensive. Not just financially, but environmentally. On Earth, we’re juggling renewables, grid limits, and backup generators that politely pretend they’re not diesel.
In conversations with clients, we’re paying more attention to token costs in AI usage, and energy is already the limiting factor. Not creativity, not demand. Just raw power and cost.
In orbit, the equation changes.
Solar panels perform far better without an atmosphere in the way. No interruptions, no intermittency, no need for vast battery farms the size of small villages. That means a much higher energy yield and, over time, dramatically lower running costs.
Some projections suggest energy costs could drop from around 5p per kWh on Earth to a fraction of a penny in space.
That’s not an optimisation. That’s a completely different game.
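To see why the yield gap matters, here’s a back-of-envelope sketch. The capacity factors are illustrative assumptions (UK ground solar typically manages around 11%; a sun-synchronous orbit can stay in sunlight almost all year), not measurements from any specific project.

```python
# Back-of-envelope comparison of annual solar yield per installed kW.
# Capacity factors are illustrative assumptions: ~11% for UK ground solar,
# ~99% for a near-continuously lit orbit. Real figures vary by site and orbit.
def annual_yield_kwh(rated_kw: float, capacity_factor: float) -> float:
    """Energy delivered over one year at a given average capacity factor."""
    hours_per_year = 8760
    return rated_kw * capacity_factor * hours_per_year

uk_ground = annual_yield_kwh(1.0, 0.11)       # ~964 kWh/year per kW
sunlit_orbit = annual_yield_kwh(1.0, 0.99)    # ~8,672 kWh/year per kW

print(f"Orbit delivers roughly {sunlit_orbit / uk_ground:.0f}x the energy per panel")
```

Under these assumptions the same panel earns its keep roughly nine times over in orbit, before you even factor in the missing battery farms.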
Cooling: Where Physics Has a Quiet Laugh
You’d think space would make cooling simple. After all, it’s cold. Very cold.
The catch is there’s no air. Which is a bit of a problem when your entire cooling strategy depends on… air.
On Earth, we move heat with air or liquid. In space, you have to radiate it away. Slowly. Patiently. Like waiting for your tea to cool, but with several megawatts of heat involved.
The physics here isn’t optional. It’s the constraint everything else has to work around.
To give you a sense of scale, a modest AI cluster in orbit could need radiator panels the size of a tennis court just to avoid turning itself into an expensive toaster.
Engineers are tackling this with some clever ideas. Heat pipes, two-phase cooling, deployable radiators that unfold like space origami. It’s elegant, slightly terrifying, and definitely not something you fix with a desk fan.
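The tennis-court claim is easy to sanity-check with the Stefan-Boltzmann law, which governs how fast a surface can radiate heat into vacuum. This is a deliberately simplified sketch: it assumes one-sided emission at room temperature and ignores solar and Earth heat loading, which real radiator designs cannot.

```python
# Rough radiator sizing from the Stefan-Boltzmann law. In vacuum, radiation
# is the only way out, so required area = P / (emissivity * sigma * T^4).
# Simplified: one-sided emission, no solar or Earth heat loading.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(heat_watts: float, temp_kelvin: float,
                     emissivity: float = 0.9) -> float:
    """Radiator area needed to reject a given heat load at a given temperature."""
    return heat_watts / (emissivity * SIGMA * temp_kelvin ** 4)

# A modest 100 kW cluster radiating at 300 K (roughly room temperature):
area = radiator_area_m2(100_000, 300)
print(f"{area:.0f} m^2")  # ~242 m^2, close to a doubles tennis court (~261 m^2)
```

Run hotter and the area shrinks fast (the T⁴ term is brutal in your favour), but hotter radiators mean hotter chips, which is its own fight.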
Connectivity: No Cables, Just Lasers
If your data centre is orbiting Earth, plugging in an Ethernet cable isn’t an option. Unless you’ve got a very long cable.
Instead, we use lasers.
Free-space optical links that fire data between satellites at ridiculous speeds. Lower latency than fibre over long distances, since light travels faster in a vacuum than in glass. Harder to intercept, and not currently bogged down in the usual spectrum red tape.
It’s basically turning space into one giant, very expensive local network.
What could possibly go wrong.
Can AI Hardware Survive Space?
Short answer: mostly.
Longer answer: it depends how much you like your memory behaving itself.
Radiation in orbit can cause bit flips, which is a polite way of saying your data occasionally decides to freestyle. Recent tests show modern AI chips are tougher than expected, but memory systems are still the weak link.
The current approach is refreshingly practical. Use standard hardware, add shielding, and let software clean up the mess.
Not perfect, but neither is anything in AI right now.
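The “let software clean up the mess” idea is classic error-correcting codes. Here’s a minimal sketch using a Hamming(7,4) code, the textbook example: 4 data bits become 7 stored bits, and any single bit flip can be located and corrected. Real flight systems use stronger schemes (SECDED memory, Reed-Solomon, and friends), but the principle is the same.

```python
# Minimal Hamming(7,4) error correction: survives any single bit flip,
# the kind of upset radiation causes in orbit. Positions are 1..7 with
# parity bits at positions 1, 2, and 4.
def encode(d: list[int]) -> list[int]:
    """Encode 4 data bits into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c: list[int]) -> list[int]:
    """Recompute the parity checks; the syndrome points at the flipped bit."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 0 = no error, else 1-based bit position
    if syndrome:
        c[syndrome - 1] ^= 1  # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
stored = encode(data)
stored[4] ^= 1                 # simulate a cosmic-ray bit flip in memory
assert decode(stored) == data  # recovered despite the flip
```

The cost is overhead: 75% more bits stored to protect 4 bits of data. Stronger codes trade less overhead for more maths, which is exactly the trade-off orbital memory systems are wrestling with.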
Robots Will Be Your IT Department
You’re not sending Dave from IT up to swap a GPU.
Orbital data centres will be built and maintained by robots. Modular components assembled in space, upgraded in place, and quietly fixed without anyone raising a ticket.
Think less “have you tried turning it off and on again” and more “it fixed itself while you were asleep”.
Swarm intelligence, autonomous docking, and self-managing systems are all part of the plan. Early versions are already being tested, which is both impressive and slightly unnerving.
The Legal Bit No One Wants to Talk About
Here’s where things get awkward.
Data on Earth is governed by national laws. Data in space? That depends on which country the satellite is registered to.
So your data might technically be “somewhere above Belgium, legally governed by Luxembourg, operated by a company in California”.
Simple.
This opens the door to digital sovereignty in a way we’ve never really seen before. Some groups are already exploring space as a way to protect sensitive data from external control.
Even software licences may need rewriting. “Worldwide” might soon need a quiet update to include “and slightly above it”.
The Environmental Trade-Off
Before we get too smug about saving the planet, there’s a catch.
Launching hundreds of rockets a year isn’t exactly subtle. Emissions in the upper atmosphere, potential ozone impact, and debris from re-entry all need to be managed.
The good news is the industry isn’t ignoring this. Cleaner fuels, reusable rockets, and better lifecycle planning are all in motion.
The goal is simple. Make the long-term gains outweigh the short-term hit.
Not perfect. But better than pretending the current model scales forever.
This isn’t really a space story. It’s an energy story wearing a space suit.
The £200/kg Moment
Right now, the biggest barrier is cost.
Getting hardware into orbit is still expensive. But like most things in space, it’s getting cheaper surprisingly quickly.
There’s a tipping point. When launch costs drop below roughly £200 per kilogram, space-based data centres start to make financial sense.
At that point, free energy, no land costs, and minimal ongoing resources become very hard to ignore.
We’re not quite there yet. But we’re circling it.
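To make the £200/kg intuition concrete, here’s some purely illustrative break-even arithmetic. Every number is an assumption for the sake of the sketch, not a published figure: a one-tonne compute module drawing 100 kW, terrestrial energy at the 5p/kWh mentioned earlier, and near-zero marginal energy cost in orbit.

```python
# Purely illustrative break-even arithmetic for the "£200/kg moment".
# All inputs are assumptions: a 1-tonne module drawing 100 kW, energy at
# 5p/kWh on Earth versus near-zero marginal energy cost in orbit.
def breakeven_years(mass_kg: float, launch_cost_per_kg: float,
                    power_kw: float, energy_saving_per_kwh: float) -> float:
    """Years of energy savings needed to pay back the launch cost."""
    launch_cost = mass_kg * launch_cost_per_kg
    saving_per_year = power_kw * 8760 * energy_saving_per_kwh
    return launch_cost / saving_per_year

# At £200/kg: 1,000 kg costs £200,000 to launch; saving 100 kW * 8,760 h
# * £0.05 = £43,800/year pays that back in under five years.
print(f"{breakeven_years(1_000, 200, 100, 0.05):.1f} years")
```

None of this counts hardware, radiators, or robots, so treat it as a direction-of-travel argument rather than a business case. The point is that launch cost stops being the dominant term somewhere around that figure.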
So, Is This Actually Happening?
Yes. Slowly, then all at once.
AI demand is climbing, terrestrial infrastructure is under pressure, and the economics of space are improving every year.
By 2035, the idea that “the cloud” lives in a warehouse might feel as outdated as dial-up.
Instead, it could be exactly what it sounds like. A network of solar-powered compute clusters quietly orbiting above us, doing the heavy lifting while we argue about prompts and pretend we understand them.
From where I sit, advising businesses on AI growth and infrastructure strategy, this isn’t a question of if. It’s timing.
And if that still sounds far-fetched, give it five years.