Advanced Micro Devices used the CES stage in Las Vegas to sharpen its challenge to NVIDIA’s dominance of the artificial intelligence chip market, unveiling new data center hardware while outlining an aggressive roadmap aimed at meeting what it describes as a looming global compute shortfall.
The announcements, delivered by AMD CEO Lisa Su, centered on a broadened AI portfolio designed to serve both hyperscale customers and enterprises that want to deploy AI systems within their own facilities. The message was clearly competitive: demand for AI compute is accelerating faster than the industry’s ability to supply it, and AMD intends to claim a larger share of that growth.
On-Premises Accelerators
A key addition to the lineup is the MI440X, an enterprise-focused accelerator built for on-premises deployment in smaller corporate data centers. Unlike rack-scale systems engineered specifically for massive AI clusters, the MI440X is designed to slot into existing infrastructure, allowing businesses to keep sensitive data local while still running advanced AI workloads. AMD positioned the chip as a practical entry point for enterprises that want AI capabilities without overhauling their data center designs.
At the high end, AMD touted the MI455X accelerator and the forthcoming Helios rack-scale system, which the company is pitching as a direct competitor to NVIDIA’s latest NVL platforms. Helios systems will combine 72 MI455X GPUs in a single rack, matching the scale of NVIDIA’s most advanced offerings and highlighting AMD’s intent to compete not just on silicon, but on full-system design.
AMD is widely seen as the closest challenger to NVIDIA in AI accelerators, yet the gap remains substantial. NVIDIA continues to sell every AI chip it can manufacture, generating tens of billions of dollars in quarterly revenue. AMD’s AI business, while now multibillion-dollar in size, is still in an earlier phase of scaling.
Part of that effort rests on partnerships. During the keynote, Su was joined by OpenAI president Greg Brockman, who emphasized the importance of continued chip innovation to support the organization’s rapidly growing compute needs. AMD signed a deal with OpenAI last year that executives say will add billions of dollars in annual revenue over time, providing both validation of AMD’s technology and a foothold with one of the world’s most demanding AI customers.
The Roadmap
Looking further ahead, Su previewed AMD’s MI500 series GPUs, slated for a 2027 launch. The company claims the chips will deliver as much as a 1,000-fold increase in AI performance over its MI300 generation, introduced in 2023. While such projections remain unverified, Su argued that gains of that magnitude will be necessary as AI use expands to billions of daily users and pushes global compute requirements dramatically higher.
CES also provided a stage for AMD to show that its AI ambitions extend beyond data centers. The company demonstrated new Ryzen AI processors for PCs and showcased work in robotics, including a humanoid robot developed by Generative Bionics and powered by AMD hardware. The demonstrations reinforced AMD’s view that AI workloads will increasingly span cloud, enterprise, edge, and device-level computing.
For AMD, the stakes are high. Its stock has surged over the past year as enthusiasm around AI infrastructure spending has intensified. But sustaining that momentum will depend on whether customers see AMD as a compelling alternative in a market still defined by NVIDIA’s scale and execution.