STOCK TITAN

Supermicro Extends AI and GPU Rack Scale Solutions with Support for AMD Instinct MI300 Series Accelerators

Rhea-AI Impact
(Low)
Rhea-AI Sentiment
(Very Positive)
Tags
AI
Rhea-AI Summary
Supermicro, Inc. (SMCI) announces three additions to its AMD-based H13 generation of GPU servers: an 8-GPU system powered by AMD Instinct™ MI300X accelerators for large-scale AI training and LLM deployments, plus 2U liquid-cooled and 4U air-cooled servers with quad AMD Instinct MI300A APUs. The new accelerators deliver up to 3.4X the performance of previous generations, and Supermicro's worldwide manufacturing facilities streamline the delivery of these new servers for AI and HPC convergence.
Positive
  • New 8-GPU systems powered by AMD Instinct MI300X accelerators offer breakthrough AI and HPC performance
  • The new systems deliver up to 3.4X performance improvement compared to previous generations
  • Supermicro's worldwide manufacturing facilities streamline the delivery of new servers for AI and HPC convergence
Negative
  • None.

New 8-GPU Systems Powered by AMD Instinct™ MI300X Accelerators Are Now Available with Breakthrough AI and HPC Performance for Large Scale AI Training and LLM Deployments

SAN JOSE, Calif., Dec. 6, 2023 /PRNewswire/ -- Supermicro, Inc. (NASDAQ: SMCI), a Total IT Solution Manufacturer for AI, Cloud, Storage, and 5G/Edge, is announcing three new additions to its AMD-based H13 generation of GPU Servers, optimized to deliver leading-edge performance and efficiency and powered by the new AMD Instinct MI300 Series accelerators. Supermicro's rack scale solutions built on 8-GPU servers in the AMD Instinct MI300X OAM configuration are ideal for large model training.

The new 2U liquid-cooled and 4U air-cooled servers with AMD Instinct MI300A Accelerated Processing Units (APUs) are now available; they improve data center efficiency and power the fast-growing, complex demands of AI, LLM, and HPC workloads. The new systems contain quad APUs for scalable applications. Supermicro can deliver complete liquid-cooled racks for large-scale environments with up to 1,728 TFlops of FP64 performance per rack. Supermicro's worldwide manufacturing facilities streamline the delivery of these new servers for AI and HPC convergence.

"We are very excited to expand our rack scale Total IT Solutions for AI training with the latest generation of AMD Instinct accelerators, with up to 3.4X the performance improvement compared to previous generations," said Charles Liang, president and CEO of Supermicro. "With our ability to deliver 4,000 liquid-cooled racks per month from our worldwide manufacturing facilities, we can deliver the newest H13 GPU solutions with either the AMD's Instinct MI300X accelerator or the AMD Instinct MI300A APU. Our proven architecture allows 1:1 400G networking dedicated for each GPU designed for large-scale AI and supercomputing clusters capable of fully integrated liquid cooling solutions, giving customers a competitive advantage for performance and superior efficiency with ease of deployment."

Learn more about Supermicro Servers with AMD Accelerators

The LLM-optimized AS -8125GS-TNMR2 system is built on Supermicro's building block architecture, a proven design for high-performance AI systems with air-cooled and liquid-cooled rack scale designs. The balanced system design pairs each GPU with dedicated 1:1 networking to provide a large pool of high-bandwidth memory across nodes and racks, fitting today's largest language models with up to trillions of parameters, maximizing parallel computing, and minimizing training time and inference latency. The 8U system with the MI300X OAM accelerator offers the raw acceleration power of eight GPUs connected by AMD Infinity Fabric™ Links, enabling up to 896GB/s of peak theoretical P2P I/O bandwidth on the open standard platform with industry-leading 1.5TB of HBM3 GPU memory in a single system, as well as native sparse matrix support designed to save power, lower compute cycles, and reduce memory use for AI workloads. Each server features dual socket AMD EPYC™ 9004 series processors with up to 256 cores. At rack scale, over 1,000 CPU cores, 24TB of DDR5 memory, 6.144TB of HBM3 memory, and 9,728 Compute Units are available for the most challenging AI environments. Using the OCP Accelerator Module (OAM), with which Supermicro has significant experience in 8U configurations, brings a fully configured server to market faster than a custom design, reducing costs and time to delivery.
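As a rough sanity check, the sketch below reproduces the per-system and rack-scale memory and compute figures quoted above. The 192GB of HBM3 and 304 Compute Units per MI300X, and the assumption of four 8U systems per rack, come from AMD's public specifications and common rack sizing rather than from this release.

    # Back-of-the-envelope check of the figures quoted above (Python).
    # Assumptions not stated in the release: 192 GB HBM3 and 304 compute
    # units per MI300X (public AMD specs), and four 8U systems per rack.
    GPUS_PER_SYSTEM = 8
    SYSTEMS_PER_RACK = 4             # assumption
    HBM3_PER_GPU_GB = 192            # assumption (public MI300X spec)
    CUS_PER_GPU = 304                # assumption (public MI300X spec)
    CPU_CORES_PER_SYSTEM = 256       # dual-socket EPYC 9004, per the release

    hbm3_per_system_gb = GPUS_PER_SYSTEM * HBM3_PER_GPU_GB            # 1,536 GB, the quoted 1.5TB
    hbm3_per_rack_gb = SYSTEMS_PER_RACK * hbm3_per_system_gb          # 6,144 GB, the quoted 6.144TB
    cus_per_rack = SYSTEMS_PER_RACK * GPUS_PER_SYSTEM * CUS_PER_GPU   # 9,728, as quoted
    cpu_cores_per_rack = SYSTEMS_PER_RACK * CPU_CORES_PER_SYSTEM      # 1,024, i.e. "over 1,000"
    print(hbm3_per_system_gb, hbm3_per_rack_gb, cus_per_rack, cpu_cores_per_rack)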

Supermicro is also introducing a density-optimized 2U liquid-cooled server, the AS -2145GH-TNMR, and a 4U air-cooled server, the AS -4145GH-TNMR, each with four AMD Instinct™ MI300A accelerators. The new servers are designed for HPC and AI applications requiring extremely fast CPU-to-GPU communication. The APU eliminates redundant memory copies by combining the highest-performing AMD CPU, GPU, and HBM3 memory on a single chip. Each server contains leadership x86 "Zen4" CPU cores for application scale-up and includes 512GB of HBM3 memory. In a full-rack (48U) solution consisting of 21 2U systems, over 10TB of HBM3 memory is available, as well as 19,152 Compute Units. The HBM3-to-CPU memory bandwidth is 5.3 TB/second.
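The full-rack figures follow from the per-system numbers in this release; the only outside assumption in the sketch below is 228 Compute Units per MI300A APU, taken from AMD's public specifications.

    # Full-rack MI300A check: 21 x 2U systems, 4 APUs and 512 GB HBM3 per
    # system (from this release); 228 compute units per MI300A is an
    # assumption from AMD's public specs.
    SYSTEMS_PER_RACK = 21
    APUS_PER_SYSTEM = 4
    HBM3_PER_SYSTEM_GB = 512
    CUS_PER_APU = 228                # assumption (public MI300A spec)

    hbm3_per_rack_gb = SYSTEMS_PER_RACK * HBM3_PER_SYSTEM_GB           # 10,752 GB, "over 10TB"
    cus_per_rack = SYSTEMS_PER_RACK * APUS_PER_SYSTEM * CUS_PER_APU    # 19,152, as quoted
    print(hbm3_per_rack_gb, cus_per_rack)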

Both systems feature dual AIOMs with 400G Ethernet support and expanded networking options designed to improve space, scalability, and efficiency for high-performance computing. The 2U direct-to-chip liquid-cooled system delivers excellent TCO with over 35% energy savings: a rack of 21 2U systems draws 61,780 watts, versus 95,256 watts for an air-cooled rack, and uses 70% fewer fans than an air-cooled system.
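The savings figure follows directly from the two per-rack power numbers quoted in this release, as the short check below shows.

    # Energy-savings check using only the per-rack figures from this release.
    liquid_cooled_watts = 61_780
    air_cooled_watts = 95_256
    savings = 1 - liquid_cooled_watts / air_cooled_watts
    print(f"{savings:.1%}")          # ~35.1%, matching the quoted "over 35%"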

"AMD Instinct MI300 Series accelerators deliver leadership performance, both for longstanding accelerated high performance computing applications and for the rapidly growing demand for generative AI," said Forrest Norrod, executive vice president and general manager, Data Center Solutions Business Group, AMD. "We continue to work closely with Supermicro to bring to market leading-edge AI and HPC total solutions based on MI300 Series accelerators and leveraging Supermicro's expertise in system and data center design."

Learn more from Supermicro and AMD experts. View this webinar live or on demand.

For more information, please visit:

Supermicro AMD Accelerator Site

AS -8125GS-TNMR2 (8U w/ MI300X)

AS -2145GH-TNMR (2U LC w/ MI300A) 

AS -4145GH-TNMR (4U AC w/ MI300A) 

About Super Micro Computer, Inc.

Supermicro (NASDAQ: SMCI) is a global leader in Application-Optimized Total IT Solutions. Founded and operating in San Jose, California, Supermicro is committed to delivering first-to-market innovation for Enterprise, Cloud, AI, and 5G Telco/Edge IT Infrastructure. We are a Total IT Solutions manufacturer with server, AI, storage, IoT, switch systems, software, and support services. Supermicro's motherboard, power, and chassis design expertise further enables our development and production, driving next generation innovation from cloud to edge for our global customers. Our products are designed and manufactured in-house (in the US, Taiwan, and the Netherlands), leveraging global operations for scale and efficiency and optimized to improve TCO and reduce environmental impact (Green Computing). The award-winning portfolio of Server Building Block Solutions® allows customers to optimize for their exact workload and application by selecting from a broad family of systems built from our flexible and reusable building blocks that support a comprehensive set of form factors, processors, memory, GPUs, storage, networking, power, and cooling solutions (air-conditioned, free air cooling or liquid cooling).

Supermicro, Server Building Block Solutions, and We Keep IT Green are trademarks and/or registered trademarks of Super Micro Computer, Inc.

AMD, the AMD Arrow logo, AMD Instinct, EPYC and combinations thereof are trademarks of Advanced Micro Devices.

All other brands, names, and trademarks are the property of their respective owners.

View original content to download multimedia: https://www.prnewswire.com/news-releases/supermicro-extends-ai-and-gpu-rack-scale-solutions-with-support-for-amd-instinct-mi300-series-accelerators-302007807.html

SOURCE Super Micro Computer, Inc.

FAQ

What is the latest announcement from Supermicro, Inc. (SMCI)?

Supermicro, Inc. (SMCI) has announced three new additions to its AMD-based H13 generation of GPU Servers, optimized to deliver leading-edge performance and efficiency, powered by the new AMD Instinct MI300 Series accelerators.

What are the key features of the new 8-GPU systems?

The new 8-GPU systems use AMD Instinct MI300X accelerators in the OAM configuration, providing industry-leading 1.5TB of HBM3 GPU memory per system, dedicated 1:1 400G networking for each GPU, and up to 3.4X the performance of previous-generation accelerators.

How does Supermicro streamline the delivery of the new servers?

Supermicro's worldwide manufacturing facilities streamline the delivery of these new servers for AI and HPC convergence.

What is the performance improvement offered by the new systems?

The new systems offer up to 3.4X performance improvement compared to previous generations.

What is the president and CEO of Supermicro, Inc. (SMCI) saying about the new systems?

Charles Liang, president and CEO of Supermicro, mentioned that the new systems offer up to 3.4X the performance improvement compared to previous generations and can be delivered at scale from their worldwide manufacturing facilities.

What type of accelerators are powering the new 8-GPU systems?

The new 8-GPU systems are powered by the new AMD Instinct MI300X accelerators; the companion 2U liquid-cooled and 4U air-cooled servers use AMD Instinct MI300A APUs.
