
IBM and AMD are teaming up to explore quantum-centric supercomputing, a hybrid approach that combines IBM’s quantum systems with AMD’s high-performance CPUs and GPUs. The goal: split complex workloads so each task runs on the architecture best suited to it—quantum for simulating nature at the atomic level, classical HPC for large-scale data crunching.
The IBM and AMD partnership isn’t a product launch, but it is a clear signal that two of computing’s biggest vendors see momentum building for practical quantum-classical workflows.
In separate statements, IBM CEO Arvind Krishna touted the partnership as a way to “push past the limits of traditional computing,” while AMD CEO Lisa Su called high-performance compute “the foundation” for tackling hard problems. The companies said they plan an initial demonstration later this year that shows IBM quantum systems operating in tandem with AMD hardware.
A key ambition behind the collaboration is developing fault tolerance. Today’s quantum systems are notoriously fragile and error-prone, a major obstacle keeping them from mainstream adoption. IBM has set a target of delivering fault-tolerant machines before 2030, and AMD’s accelerators (high-performance data center GPUs) could handle the classical control and error-mitigation workloads a real-world system would require.
The union of quantum and classical computers offers huge potential. In drug discovery, for instance, quantum processors could simulate molecular interactions that classical computers can’t handle, while classical systems can capably manage data pipelines, model training, and result ranking.
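To make that division of labor concrete, here is a hypothetical sketch of such a pipeline in Python. Every function name below is an illustrative placeholder rather than a real IBM or AMD API: the quantum step stands in for a molecular-energy estimate, while the classical steps handle screening and ranking.

```python
# Hypothetical sketch of a hybrid drug-discovery workflow.
# All functions are illustrative placeholders, not a real IBM or AMD API.

def classical_screen(candidates):
    """Classical HPC step: cheaply filter a large candidate library."""
    return [mol for mol in candidates if len(mol) <= 4]  # toy filter standing in for an ML model

def quantum_binding_score(molecule):
    """Quantum step: in a real system, a quantum processor would estimate
    molecular energies that are intractable classically; here it's a dummy value."""
    return -0.5 * len(molecule)

def rank_results(scored):
    """Classical step: sort and report the most promising molecules."""
    return sorted(scored, key=lambda pair: pair[1])

library = ["H2O", "NH3", "CH4", "C6H6"]
shortlist = classical_screen(library)
scored = [(mol, quantum_binding_score(mol)) for mol in shortlist]
print(rank_results(scored))  # best (most negative) placeholder score first
```

The point of the pattern is that only the narrow, classically intractable step runs on quantum hardware; everything before and after it stays on conventional CPUs and GPUs.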
This hybrid approach is already showing progress: IBM has linked its modular System Two to Japan’s Fugaku supercomputer, and worked with partners including Cleveland Clinic and Lockheed Martin to test combined workflows. On the classical side, AMD’s silicon has earned bragging rights by powering the world’s two fastest systems on the TOP500 list, Frontier and El Capitan, demonstrating the company’s ability to support quantum-adjacent workloads.
IBM and AMD plan to focus on an open ecosystem built around Qiskit, the open-source quantum computing framework developed by IBM. The plan is to offer an interface that lets developers create hybrid algorithms with minimal effort. That open approach should lower the barrier for companies to experiment with hybrid quantum workflows, helping drive down costs and broaden adoption of this nascent technology.
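As a rough illustration of what such a hybrid algorithm looks like today, the sketch below uses Qiskit’s current primitives in a generic variational loop: a classical optimizer (the kind of work AMD CPUs and GPUs would accelerate) repeatedly tunes the parameters of a small quantum circuit whose expectation value is estimated by a quantum backend. This assumes Qiskit 1.x and a local simulator; it is not the interface IBM and AMD plan to ship.

```python
# A minimal hybrid quantum-classical loop in Qiskit (generic variational pattern;
# assumes Qiskit 1.x primitives and a local simulator, not the planned IBM/AMD interface).
import numpy as np
from scipy.optimize import minimize
from qiskit.circuit.library import RealAmplitudes
from qiskit.quantum_info import SparsePauliOp
from qiskit.primitives import StatevectorEstimator

ansatz = RealAmplitudes(num_qubits=2, reps=1)                     # parameterized quantum circuit
observable = SparsePauliOp.from_list([("ZZ", 1.0), ("XI", 0.5)])  # toy observable to minimize
estimator = StatevectorEstimator()                                # local simulator; real hardware would slot in here

def cost(params):
    # Quantum step: estimate <observable> for the candidate parameters
    job = estimator.run([(ansatz, observable, params)])
    return float(job.result()[0].data.evs)

# Classical step: a CPU-side optimizer drives the parameter search
initial = np.zeros(ansatz.num_parameters)
result = minimize(cost, initial, method="COBYLA")
print("Minimized expectation value:", result.fun)
```

In a quantum-centric supercomputer, the estimator call would target IBM quantum hardware while the optimizer and surrounding data processing run on classical accelerators; the programming pattern stays the same.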
Momentum in quantum is uneven but shows real progress. McKinsey projects the quantum technology sector could grow from roughly $4 billion in 2024 to as much as $97 billion by 2035, with computing accounting for the majority of that value. Meanwhile, as reported in Techstrong.it, debate continues over timelines: Nvidia CEO Jensen Huang earlier voiced skepticism about near-term utility, then later acknowledged an “inflection point.” Alphabet’s Google Quantum AI recently highlighted error-reduction progress with its Willow chip, encouraging quantum advocates that system-level advances are compounding.
The IBM-AMD partnership’s focus on “hybrid” rather than purely quantum may prove strategically smart, though it’s not guaranteed. Joining quantum and classical computing brings its own challenges: the two companies will need to show that end-to-end latency, scheduling, and error budgeting hold up under real application loads. And while corporate pilots are growing, customers require solid ROI paths, not roadmaps alone.
The bottom line: IBM and AMD don’t need to produce results immediately. They’re betting that tight integration across quantum hardware, accelerators, and open software will make hybrid workflows useful sooner, and that enterprise buyers will reward vendors who make that complexity manageable. The demo later this year will be one to watch: it will offer major clues about when hybrid quantum adoption will go mainstream.