Barely a month after announcing a Department of Energy supercomputer project with Dell, NVIDIA Corp. was at it again Tuesday. It said it is teaming with Hewlett Packard Enterprise Inc. and the Leibniz Supercomputing Centre to build a new supercomputer using NVIDIA’s next-generation Vera Rubin chips.

The Blue Lion supercomputer, which pairs HPE’s next-generation Cray technology with NVIDIA’s GPUs, is expected to be available to scientists in early 2027. It is the latest entrant in NVIDIA’s push to persuade scientists to use artificial intelligence (AI) to extrapolate accurate predictions from a smaller number of precise calculations.

The new supercomputer system will be roughly 30 times more powerful than Germany’s current flagship machine SuperMUC-NG, according to NVIDIA.

During a press conference, NVIDIA also introduced its “Climate in a Bottle” AI model, which would allow scientists to take observed sea surface temperatures and extrapolate forecasts for the next 10 to 30 years across the planet’s surface. The model is part of NVIDIA’s Earth-2 platform, which blends AI, GPU acceleration and physics-based modeling. Its goal is to generate precise atmospheric data faster and more energy efficiently than traditional simulation methods.

“Researchers will use a combined approach of classic physics and AI to resolve turbulent atmospheric flows,” Dion Harris, head of data center product marketing at NVIDIA, said. “This technique will allow them to analyze thousands and thousands more scenarios in greater detail than ever before.”
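To make the ensemble idea Harris describes concrete, the sketch below shows, in purely illustrative form, how an AI emulator might be sampled many times, conditioned on observed sea-surface temperatures, to build a distribution of multi-decade outcomes. Everything here is a hypothetical stand-in: the `sample_scenario` function, its noise model, the grid size and the numbers are assumptions for illustration, not NVIDIA’s Earth-2 or Climate in a Bottle code.

```python
import numpy as np

# Hypothetical stand-in for a trained AI climate emulator: given a grid of
# sea-surface-temperature anomalies (degrees C), return one plausible
# multi-decade projection of surface temperature anomalies. A real emulator
# would be a learned generative model; here we just add structured noise.
def sample_scenario(sst_anomaly: np.ndarray, years: int,
                    rng: np.random.Generator) -> np.ndarray:
    trend = 0.02 * years                                    # assumed mean warming over the horizon
    noise = rng.normal(0.0, 0.1, size=sst_anomaly.shape)    # assumed internal variability
    return sst_anomaly + trend + noise

rng = np.random.default_rng(0)
sst = rng.normal(0.3, 0.2, size=(180, 360))  # toy 1-degree global SST anomaly grid

# Cheap per-sample inference is what makes large ensembles practical:
# draw a thousand scenarios instead of a handful of full physics runs.
ensemble = np.stack([sample_scenario(sst, years=30, rng=rng) for _ in range(1000)])

print("ensemble mean anomaly: %.2f C" % ensemble.mean())
print("5th-95th percentile spread: %.2f C"
      % (np.percentile(ensemble, 95) - np.percentile(ensemble, 5)))
```

The point of the sketch is only the workflow: condition on a current observation, sample many cheap AI-generated scenarios, then study the spread, which is the kind of analysis a traditional simulation budget would limit to a handful of runs.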

Separately, NVIDIA said Jupiter, another supercomputer using its chips at Germany’s Jülich research center, is now Europe’s fastest system.

Speaking of fast, last month Lawrence Berkeley National Laboratory announced the Doudna supercomputer, an NVIDIA-Dell collaboration using Vera Rubin chips to deliver what the lab claims will be significantly faster calculations. The lab expects the new machine to be 10 times faster than its current most powerful system, and potentially the Department of Energy’s biggest resource for training AI models and other tasks when it debuts in 2026.

Long before it became an AI titan, NVIDIA was trying to persuade scientists to use its chips to speed up complex computing problems like modeling climate change. The German supercomputer reflects NVIDIA’s broader push into science-focused applications for AI and high-performance computing beyond the data center and enterprise cloud.
