Rackspace Technology and AMD have signed a Memorandum of Understanding (MOU) to create a new category of managed enterprise AI infrastructure. The core idea: instead of renting generic GPU capacity by the hour and handling integration, security, and accountability themselves, enterprises would get a fully managed, governed stack built on dedicated AMD Instinct GPUs and EPYC CPUs. Rackspace would own the entire stack, from silicon to outcomes, with defined SLAs and accountability.
What it is
The proposed offering is called the Enterprise AI Cloud. It is a fully managed, private and hybrid AI environment where Rackspace assembles, integrates, and operates the full stack — accelerated compute, AI inference, and agents in production — for enterprises that require sovereignty, compliance, and operational accountability. The MOU framework includes four integrated capabilities:
- Enterprise AI Cloud: A private/hybrid environment built on AMD Instinct GPUs and EPYC CPUs, operated under Rackspace's governed model.
- Enterprise Inference Engine: A context-aware inference runtime that retains domain knowledge, session history, and enterprise-specific data context across queries. Rackspace would own the SLA for availability, scaling, and performance.
- Inference as a Service: Dedicated, managed AMD Instinct GPUs with developer-ready inference and fine-tuning toolkits. Customers bring their own models and engineering teams; Rackspace provides reliable bare metal capacity with operational discipline and performance SLOs.
- Bare Metal AMD Instinct: Dedicated, high-performance bare metal AMD Instinct compute for customers requiring physical isolation, deterministic performance, and direct hardware access for demanding training and inference workloads.
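To make the "context-aware" behavior described for the Enterprise Inference Engine concrete, here is a minimal, purely illustrative sketch of a session wrapper that retains domain context and conversation history across queries. All names here (`ContextSession`, `ask`, `_infer`) are hypothetical and do not reflect any actual Rackspace or AMD API; the model call is a stand-in.

```python
# Hypothetical sketch: a session object that carries enterprise context
# and prior turns into every new query, so answers stay grounded in
# accumulated state rather than treating each query in isolation.

class ContextSession:
    def __init__(self, domain_context: str):
        self.domain_context = domain_context  # enterprise-specific data context
        self.history: list[str] = []          # session history retained across queries

    def ask(self, query: str) -> str:
        # Build the prompt from retained context plus the new query.
        prompt_parts = [self.domain_context] + self.history + [query]
        answer = self._infer("\n".join(prompt_parts))
        # Retain this turn so later queries see it.
        self.history.extend([query, answer])
        return answer

    def _infer(self, prompt: str) -> str:
        # Stand-in for a real model call; reports how much context it saw.
        return f"[answer grounded in {len(prompt.splitlines())} context lines]"


session = ContextSession(domain_context="policy: data stays in-region")
first = session.ask("What regions can we deploy to?")
second = session.ask("And failover?")  # sees the first turn via retained history
```

The second query's prompt includes the first question and answer, which is the essence of retaining session state across queries; a production runtime would add persistence, eviction, and access controls on top of this pattern.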
Why it matters
Today's dominant model requires enterprises to rent GPU capacity by the hour and carry the operational burden themselves — integration, security, and accountability. This collaboration proposes to invert that model. The aim is to give enterprises a single operator accountable for every layer, calibrated to the sovereignty, performance, and compliance requirements of each workload.
Caveats
The MOU is a framework for potential collaboration, not a binding commitment. No definitive agreements have been reached, and discussions remain preliminary. There is no assurance that any such arrangements will be entered into or that the anticipated benefits will be realized. Any third-party financing required is subject to availability on acceptable terms.
Bottom line
Rackspace and AMD are proposing a governed alternative to commodity GPU rental for regulated enterprises and sovereign workloads. If executed, it would give enterprises a single operator accountable for the full AI stack, from silicon to outcomes, with defined SLAs and compliance built in from the start. The key question is whether the MOU will translate into a binding commercial agreement.