
The data center is heading for new worlds previously imagined only in science fiction. Pressured by AI’s voracious appetite for compute and by power constraints on the ground, operators and visionaries are testing three frontiers at once: the seabed, Earth orbit, and a nuclear reboot on land. None is risk-free and each faces considerable obstacles, yet all are moving faster than many expected.
Cooling with Currents: Submerging the Server Farm
China is taking the lead in commercial underwater deployments. Chinese maritime company Highlander Digital Technology plans to sink new pods off Shanghai, building on an earlier Hainan trial. The strategy is to swap energy-hungry, land-based cooling systems for the Pacific’s free heat sink. Company leaders say ocean currents cut cooling energy by roughly 90%, with electricity sourced primarily from nearby offshore wind; the company targets about 95% renewables for the site.
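For a sense of scale, the sketch below applies that claimed 90% cut to an assumed 10 MW pod cluster with a typical land-based power usage effectiveness (PUE) of 1.4. Every figure here is an illustrative assumption, not a number reported for the Shanghai project.

```python
# Illustrative arithmetic only: the IT load, PUE, and cooling share below
# are generic assumptions, not figures disclosed by Highlander.
IT_LOAD_MW = 10.0      # assumed IT load of a pod cluster
LAND_PUE = 1.4         # assumed PUE of a conventional air-cooled site
COOLING_SHARE = 0.75   # assumed fraction of overhead that is cooling

overhead_mw = IT_LOAD_MW * (LAND_PUE - 1)        # 4.0 MW of non-IT load
land_cooling_mw = overhead_mw * COOLING_SHARE    # 3.0 MW spent on cooling
subsea_cooling_mw = land_cooling_mw * (1 - 0.9)  # ~0.3 MW after a 90% cut

print(f"Cooling load on land:  {land_cooling_mw:.1f} MW")
print(f"Cooling load subsea:  ~{subsea_cooling_mw:.1f} MW")
```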
Similarly, Microsoft’s Project Natick ran a sealed capsule off Scotland through 2020 and reported a server failure rate roughly one-eighth that of a comparable land installation. But after retrieval, the company paused commercial rollout, noting that access, maintenance, and a rapid server refresh cycle complicate an underwater business case.
Environmental questions remain: even modest thermal plumes could alter local ecosystems, a problem that worsens as power density rises. Still, if operators accept that ocean pods will serve niche workloads and pair them with green power, the model can complement, but not replace, traditional data facilities.
The Orbital Gambit: Space as the Ultimate “Greenfield”
Jeff Bezos has put a rough timeline on space-borne data centers: within 10–20 years, he argues, we’ll build gigawatt-class facilities in orbit. In space, of course, solar power is continuous—no clouds, no night—and waste heat can be radiated into vacuum.
The problem is latency. Even low Earth orbit implies 20–40 ms round-trip, and geostationary can stretch past half a second. That rules out interactive workloads but leaves room for batch-oriented tasks like AI model training.
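Those numbers follow from simple light-travel arithmetic. The sketch below assumes a gateway-relayed path (request up and down, reply up and down) at representative altitudes; real links add slant range, routing, and processing time on top of this floor.

```python
# Minimum propagation delay for a gateway-relayed round trip, assuming
# straight vertical paths at the speed of light. Altitudes are assumed;
# real links are longer (slant angles) and add network overhead.
C_KM_PER_MS = 299_792.458 / 1_000   # light covers ~300 km per millisecond

def relayed_round_trip_ms(altitude_km: float) -> float:
    """Four legs: up to the satellite, down to a gateway, and back again."""
    return 4 * altitude_km / C_KM_PER_MS

print(f"LEO  (~550 km):    {relayed_round_trip_ms(550):6.1f} ms floor")     # ~7 ms
print(f"GEO  (~35,786 km): {relayed_round_trip_ms(35_786):6.1f} ms floor")  # ~477 ms
```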
The engineering bar is formidable. A gigawatt of compute must shed a gigawatt of heat through radiator systems that are several orders of magnitude larger than anything now in orbit. Logistics require heavy-lift rockets and autonomous robotic servicing.
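A back-of-envelope Stefan-Boltzmann estimate shows why. The sketch below assumes a two-sided radiator near room temperature and ignores incoming solar and Earth heat, which already flatters the result; the required area still comes out around a square kilometer.

```python
# Rough radiator sizing via the Stefan-Boltzmann law (P = e * sigma * A * T^4).
# Emissivity, temperature, and the two-sided assumption are mine; incoming
# solar and Earth-shine heat are ignored, which understates the real area.
SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W / (m^2 * K^4)
EMISSIVITY = 0.9       # assumed high-emissivity radiator coating
T_RADIATOR_K = 300.0   # assumed radiator surface temperature (~27 C)
HEAT_LOAD_W = 1e9      # one gigawatt of compute heat to reject

flux_w_per_m2 = EMISSIVITY * SIGMA * T_RADIATOR_K**4   # ~413 W/m^2 per side
area_m2 = HEAT_LOAD_W / (2 * flux_w_per_m2)            # two radiating faces

print(f"Radiated flux:  {flux_w_per_m2:.0f} W/m^2 per side")
print(f"Radiator area: ~{area_m2 / 1e6:.1f} km^2")     # roughly 1.2 km^2
```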
Still, early pathfinders exist: HPE’s Spaceborne Computers on the International Space Station have trialed edge storage and recovery in microgravity, and startups are prototyping compact space compute nodes. Also encouraging, investors are positioning for a space-support supply chain, with early money flowing into warehousing and manufacturing on Earth aimed at building off-planet infrastructure.
Even if the physics and launch costs can be worked out, it appears orbital data centers will be selective tools. They’ll likely be confined to workloads where power density and energy price dominate, and where added latency is tolerable.
The Nuclear Return: Powering AI at Grid Scale
Far more realistic, and far more actively in development, is nuclear power, which data center developers are pushing as the foundation of the AI boom’s energy supply. In particular, there’s hope that small modular reactors (SMRs) and revived legacy plants can deliver low-carbon, 24/7 power adjacent to expanding data facilities.
The challenge here is that these plants can’t be built overnight. In Michigan, for instance, the Palisades plant—shut down in 2022—is slated to restart with federal financing, with plans to add two SMR-300 units on site by the early 2030s.
Nationally, polling shows support for nuclear has climbed over the past decade, and insurers point to improved engineering and risk modeling. Big Tech, as usual, isn’t waiting on consensus. Microsoft agreed to purchase power linked to a proposed Three Mile Island restart, and Google is working with a next-gen nuclear vendor through the Tennessee Valley Authority.
Still, there’s a big hill to climb. SMR timelines are debated, regulatory frameworks are still optimized for large reactors, and early projects have faced cost overruns and cancellations. Even with momentum, wide deployment will be measured in years, not quarters. On the plus side, compared with building out new renewables at the pace AI’s power demand is growing, colocated nuclear has become a credible path, if only for a subset of hyperscale facilities.
The Capital Question
Perhaps the most science-fictional element of all this is how much it will cost. Whether it’s millions of square meters of solar arrays in orbit, pressure-rated subsea capsules, or first-of-a-kind SMRs, the up-front capex exacts a heavy toll in exchange for lower operating costs and better carbon math later. The only companies with pockets that deep are the few elite hyperscalers: Amazon, Microsoft, Meta, and Google.
For the rest of the market, the near-term move is more prosaic: contracts that tie data center growth to dedicated generation, and sites that will still permit a large facility—increasingly hard to find as local NIMBY (not in my backyard) resistance to data centers grows.
What’s new here is the generational transformation wrought by AI. It’s likely that AI’s growth will force computing to chase abundant energy and efficient heat rejection, so that tomorrow’s availability zones will span ocean trenches, reactor fences, and—sooner than many expect—the near reaches of space.