AI Is Driving a Cloud Repatriation Wave

For more than a decade, the public cloud has been the go-to environment for enterprises looking to scale infrastructure quickly and flexibly. But with the rise of AI workloads, that trend is shifting fast. Enterprises are discovering that many of the architectural and governance benefits that made the public cloud so appealing are now being tested and, in some cases, reversed. 

According to a recent white paper on cloud usage and management trends from GTT, based on a survey of IT leaders conducted by Hanover Research, private cloud spending is projected to grow nearly twice as fast as public cloud spending in 2025 among organizations with cloud budgets of $10 million or more. Even more telling, more than half of AI workloads are already running in private cloud or on-premises environments. A repatriation wave is quietly reshaping enterprise cloud strategies.

Why AI Is Changing the Cloud Equation

AI workloads are unique. Unlike conventional applications, many AI use cases require high-throughput compute, low latency and strict controls over where and how sensitive data is stored and processed. From agentic AI that makes autonomous decisions to real-time inference at the edge, the demands of AI workloads are forcing enterprises to reassess their cloud architectures. 

The public cloud has clear strengths: global scale, elasticity and speed of deployment. But it also introduces challenges that are magnified by AI: 

  • Data privacy and governance risks, especially when proprietary or regulated data is involved 
  • Unpredictable costs tied to compute usage, APIs and network traffic 
  • Latency and performance constraints that can hinder real-time inference and edge use cases 
  • The need for an AI factory design that optimizes applications across the entire AI life cycle 

These challenges manifest differently across industries. In manufacturing, AI-powered quality control systems often require low-latency inference and local data processing at the edge. In financial services, institutions use AI for fraud detection and risk modeling, workloads that benefit from predictable costs, tighter control and regulatory alignment. 

With private cloud and on-prem environments, organizations gain the control, predictability and customizability required to support this kind of compute infrastructure. 

These environments are often better equipped to meet the performance and latency demands of AI, particularly for inference, which is best placed close to where data is generated or consumed. 
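As a rough illustration of why placement matters for inference, the sketch below compares mean round-trip latency to a nearby (on-prem or edge) inference endpoint against a remote public-cloud endpoint. The endpoint URLs and payload are hypothetical placeholders, not anything described in the article; the point is simply that measuring latency from where the data originates is a quick way to test whether a workload belongs closer to the edge.

import json
import time
import urllib.request

# Hypothetical endpoints -- replace with your own deployments.
ENDPOINTS = {
    "edge/on-prem": "http://10.0.0.12:8080/v1/infer",
    "public cloud": "https://inference.example-cloud.com/v1/infer",
}

PAYLOAD = json.dumps({"inputs": [[0.1, 0.2, 0.3]]}).encode("utf-8")


def time_inference(url: str, payload: bytes, runs: int = 20) -> float:
    """Return the mean round-trip latency in milliseconds over `runs` calls."""
    total = 0.0
    for _ in range(runs):
        req = urllib.request.Request(
            url, data=payload, headers={"Content-Type": "application/json"}
        )
        start = time.perf_counter()
        with urllib.request.urlopen(req, timeout=5) as resp:
            resp.read()  # drain the response so the full round trip is timed
        total += time.perf_counter() - start
    return (total / runs) * 1000


if __name__ == "__main__":
    for name, url in ENDPOINTS.items():
        print(f"{name}: {time_inference(url, PAYLOAD):.1f} ms mean round trip")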

Repatriation Doesn’t Mean Regression 

Repatriation can be mischaracterized as a step backward. In reality, it reflects a more intentional and nuanced approach to workload placement. Rather than defaulting to the public cloud for everything, organizations are asking: What belongs where? Which workloads benefit from scale, and which demand sovereignty, security or cost stability? 

Cloud strategies are no longer just about resilience or vendor diversity. They are evolving and integrating into wider architectural frameworks where AI workloads are intentionally controlled and routed to the environments best suited for their operational and governance needs. 
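One way to make that routing concrete is a simple placement policy that scores each environment against a workload's latency, data-governance and elasticity requirements. The sketch below is illustrative only; the environments, attributes and weights are assumptions for the example, not a prescribed framework from the article.

from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    max_latency_ms: float     # latency budget for the workload
    regulated_data: bool      # subject to data-residency or compliance rules
    needs_burst_scale: bool   # benefits from elastic, on-demand capacity


@dataclass
class Environment:
    name: str
    typical_latency_ms: float  # observed latency from the data source
    sovereign_control: bool    # full control over data location and handling
    elastic: bool              # can absorb bursty demand


def place(workload: Workload, environments: list[Environment]) -> Environment:
    """Pick the environment with the highest score for this workload."""
    def score(env: Environment) -> int:
        s = 0
        if env.typical_latency_ms <= workload.max_latency_ms:
            s += 2  # meets the latency budget
        if workload.regulated_data and env.sovereign_control:
            s += 2  # satisfies the governance requirement
        if workload.needs_burst_scale and env.elastic:
            s += 1  # elasticity counts when the workload actually needs it
        return s

    return max(environments, key=score)


if __name__ == "__main__":
    envs = [
        Environment("public cloud", typical_latency_ms=60, sovereign_control=False, elastic=True),
        Environment("private cloud", typical_latency_ms=15, sovereign_control=True, elastic=False),
        Environment("on-prem edge", typical_latency_ms=5, sovereign_control=True, elastic=False),
    ]
    fraud = Workload("fraud detection", max_latency_ms=20, regulated_data=True, needs_burst_scale=False)
    print(place(fraud, envs).name)  # -> private cloud (ties broken by list order)

In practice the scoring would fold in cost models, network paths and observability constraints, but even a crude policy like this forces the "what belongs where" question to be answered explicitly rather than by default.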

What IT Teams Should Be Asking 

For practitioners navigating this shift, several questions are critical: 

  • Where do our AI workloads live today, and why? 
  • Are our current environments meeting the latency, compliance and compute needs of those workloads? 
  • Do we have visibility across a hybrid or multicloud environment to monitor performance and security? 
  • Can our network architecture handle the increased demands of distributed AI workloads? 

Repatriation isn’t a simple toggle. It requires rethinking networking, security, observability and cloud orchestration. Many organizations are also facing internal skills gaps, making managed services or outside partnerships an increasingly important part of the equation. 

The Future Must Be Flexible 

AI is redefining infrastructure strategy. The public cloud is not going away and still plays a key role, but it is no longer the default for every use case. The future belongs to organizations that can adapt by building secure, flexible and workload-aware environments that align with today's business realities. Repatriation is just one sign of that evolution. 
