OCI Chooses AMD MI300X Accelerators for AI Supercluster

News Desk


AMD (NASDAQ: AMD) has announced that Oracle Cloud Infrastructure (OCI) has selected AMD Instinct™ MI300X accelerators and ROCm™ open software for its latest OCI Compute Supercluster instance, BM.GPU.MI300X.8. This state-of-the-art supercluster is designed to handle the most demanding AI workloads, including large language model (LLM) inference and training, supporting up to 16,384 GPUs in a single cluster.
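For developers, the practical entry point to these instances is the ROCm software stack. As a minimal sketch only, assuming a ROCm build of PyTorch is installed on a BM.GPU.MI300X.8 bare metal node (whose shape name indicates eight MI300X GPUs), the accelerators can be enumerated through the familiar torch.cuda interface, which ROCm builds of PyTorch reuse for AMD GPUs:

```python
# Minimal sketch: enumerating the MI300X GPUs on a BM.GPU.MI300X.8 instance.
# Assumes a ROCm build of PyTorch; on ROCm, AMD GPUs are exposed through the
# torch.cuda namespace, so no CUDA-specific code changes are needed.
import torch

if torch.cuda.is_available():
    n = torch.cuda.device_count()  # expected to report 8 on this instance shape
    for i in range(n):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, {props.total_memory / 2**30:.0f} GiB")
else:
    print("No ROCm-visible GPUs found")
```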

The OCI Supercluster uses the same ultrafast network fabric technology as OCI's other accelerator offerings, delivering the performance and efficiency needed to train and serve AI models with hundreds of billions of parameters. It combines high throughput with the large memory capacity of the MI300X (192 GB of HBM3 per GPU) and high memory bandwidth, meeting the needs of today's data-intensive AI applications.

Andrew Dieckmann, Corporate Vice President and General Manager of AMD’s Data Center GPU Business, highlighted the growing adoption of AMD Instinct MI300X and ROCm software. “Our solutions continue to prove themselves as reliable for powering critical OCI AI workloads. The expansion of these technologies into the AI market benefits OCI customers with enhanced performance, efficiency, and system design flexibility.”

Donald Lu, Senior Vice President of Software Development at Oracle Cloud Infrastructure, emphasized the advantages of integrating AMD Instinct MI300X accelerators into OCI’s infrastructure. “The inference capabilities of these accelerators contribute to OCI’s broad range of high-performance bare metal instances, eliminating the overhead of virtualized compute typically used for AI tasks. We are excited to provide customers with more options for accelerating AI workloads affordably.”

The AMD Instinct MI300X has undergone rigorous testing, demonstrating its AI inferencing and training capabilities even at larger batch sizes and with very large LLMs. Fireworks AI, a leading platform for building and serving generative AI, has already adopted these instances, benefiting from their high performance.
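As an illustrative sketch only (not Fireworks AI's or Oracle's actual serving stack), the kind of batched LLM inference described above might look like the following on a single MI300X, assuming a ROCm build of PyTorch and the Hugging Face transformers library; the model identifier is just a small, openly available stand-in:

```python
# Illustrative sketch: batched causal-LM inference on one MI300X.
# Assumes a ROCm build of PyTorch (AMD GPUs appear under torch.cuda) and the
# transformers library; "facebook/opt-1.3b" is only a small stand-in model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/opt-1.3b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.padding_side = "left"  # left-padding is preferred for decoder-only generation
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16).to("cuda")

# The large per-GPU memory of the MI300X leaves headroom for bigger batches
# before tensor or pipeline parallelism becomes necessary.
prompts = ["Summarize the benefits of bare metal GPU instances."] * 32
inputs = tokenizer(prompts, return_tensors="pt", padding=True).to("cuda")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```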

Lin Qiao, CEO of Fireworks AI, stated, “Fireworks AI leverages OCI and AMD Instinct MI300X to deliver rapid AI deployment across various industries. The memory capacity and performance of these accelerators allow us to scale our services effectively as AI models continue to evolve.”

With this collaboration, OCI and AMD are setting new benchmarks in AI infrastructure, delivering cutting-edge performance and open choices for developers and enterprises worldwide.
