
British chip designer Arm Holdings forecasts that its share of data center CPU sales will surge this year, from approximately 15% in 2024 to as much as 50% by the end of 2025.
Arm, which makes chip designs used by many other vendors, claims that the energy efficiency of its designs—now a pressing concern given the enormous energy demand surge caused by AI—will drive this major increase in sales.
While some industry observers might dismiss this bold forecast as a marketing effort, analysts say the prediction is not unreasonable. “The trend is no doubt in that direction,” said Richard Gordon, vice president and practice lead, Semiconductors, at The Futurum Group. Gordon cautioned, however, that because the base numbers and the way the projection is measured are hard to pin down, it is difficult to comment on exact figures by year’s end. Overall, though, he said the market is moving in the direction Arm is forecasting.
Arm’s forecast of surging sales is notable given the data center sector’s longstanding reliance on x86 chips; these processors, made by AMD and Intel, have historically been the core engine of large data facilities. Arm, by contrast, had little presence in the data center: its strength has been chips for small devices like tablets, gaming consoles and embedded systems, and its designs power virtually all smartphones.
Arm chips’ ability to power devices on little energy has proven to be a superpower in the age of artificial intelligence (AI). As AI’s energy demands have strained data centers, Arm’s energy-efficient designs are now enjoying a huge upswing in interest. The transition has been slow, however, because large enterprises have had to revise legacy software code, and often replace aging hardware, to accommodate Arm designs.
Now, in 2025, the data center industry appears to have reached a turning point, particularly as the “big three” of cloud computing have embraced Arm designs.
Cloud giant Amazon Web Services (AWS) said in December that its Arm-based CPUs for data center use make up more than 50% of the chip capacity it has added since 2023. Microsoft is using Arm-based chips in its own data centers and in its AI-based Copilot+ PCs. Most notable in terms of high performance is the company’s Arm-based Azure Maia AI Accelerator.
Google developed its Arm-based Axion CPU for its data centers and now uses it to support AI workloads. Google touted its first Axion-based virtual machines as offering 60% better energy efficiency and 65% better price-performance than comparable x86-powered deployments.
Impressively, the Fugaku supercomputer, which held bragging rights as the world’s fastest supercomputer from 2020 to 2022, uses an Arm processor.
Arm, for its part, credits this year’s forecast sales increase to the adoption of Arm Neoverse, a chip design initiative launched in 2017 to “target the full breadth of the infrastructure market.” Arm marketed Neoverse as offering best-in-class performance from “Cloud to Edge,” a far cry from the company’s traditional small-device focus.
“AI servers are set to grow by more than 300 percent in the next few years, and for that to scale, power-efficiency is no longer a competitive advantage—it is a baseline industry requirement,” said Mohamed Awad, SVP and GM of the Infrastructure Business, Arm. “Today, we’re designing datacenters in gigawatts, not megawatts—and in that world, power-efficiency defines profitability. This is the same power-efficiency that has been part of the Arm DNA for the past 35 years.”