Synopsis: Joe Capes, CEO of LiquidStack, explains why LiquidStack shifted from Bitcoin mining to advanced cooling technologies. The focus is on liquid cooling methods that improve efficiency for AI data centers. LiquidStack introduces the GigaModular, a scalable solution for AI workloads, addressing challenges in scaling infrastructure, power supply, and rapid deployment.
LiquidStack’s story began in 2012 with Bitcoin mining, where the company pioneered two-phase immersion cooling to squeeze efficiency out of power-hungry rigs. Over time, the focus moved beyond crypto into data centers and edge computing, expanding into single-phase immersion and direct-to-chip solutions. That pivot couldn’t have been better timed. Today, nearly 80% of large-scale AI deployments are liquid-cooled, and the industry is headed toward 100%. Simply put, air cooling can’t keep pace with racks running hundreds of kilowatts, and in some cases pushing past a megawatt.
That’s where LiquidStack’s latest platform comes in. The GigaModular, announced at Data Center World Congress, is the first modular, scalable 10 MW coolant distribution unit (CDU). It gives operators the ability to scale cooling capacity in line with their AI workloads, whether they’re standing up a few megawatts or rolling out hundreds. With rack densities climbing an order of magnitude in just two years, that kind of flexibility is no longer optional.
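To get a feel for what a 10 MW CDU means at today’s rack densities, here is a minimal back-of-the-envelope sketch. The density figures are illustrative assumptions, not LiquidStack specifications:

```python
# Illustrative only: how many whole racks a single 10 MW coolant
# distribution unit (CDU) could serve at various assumed rack densities.
# The densities below are hypothetical examples, not vendor figures.

CDU_CAPACITY_KW = 10_000  # 10 MW CDU expressed in kilowatts

def racks_supported(rack_density_kw: float,
                    capacity_kw: float = CDU_CAPACITY_KW) -> int:
    """Whole racks one CDU can cool at a given per-rack heat load."""
    return int(capacity_kw // rack_density_kw)

for density in (50, 100, 250, 500, 1000):  # kW per rack (assumed)
    print(f"{density:>5} kW/rack -> {racks_supported(density):>3} racks per CDU")
```

The spread is the point: as racks climb from tens of kilowatts toward a megawatt, a fixed-capacity unit serves an order of magnitude fewer racks, which is why a CDU that scales with the deployment matters.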
Capes notes that AI data centers, which he calls “AI factories,” aren’t cost centers but revenue engines. Every watt saved on cooling is a watt redirected to compute, and operators are racing to bring capacity online. From adaptive reuse of shuttered car plants to massive greenfield builds, the speed and scale are putting pressure on the entire ecosystem. For LiquidStack, that means doubling down on manufacturing in Dallas and abroad to keep pace.