Micron Innovates From the Data Center to the Edge With NVIDIA

Micron Technology (MU) announces its position as the world's first and only memory company shipping both HBM3E and SOCAMM products for AI servers. The company has developed SOCAMM, a modular LPDDR5X memory solution, in collaboration with NVIDIA for their GB300 Grace Blackwell Ultra Superchip.

The SOCAMM solution, now in volume production, offers 2.5x higher bandwidth than RDIMMs, occupies one-third the size of a standard RDIMM, consumes one-third the power of DDR5 RDIMMs, and provides 128GB of capacity using four 16-die stacks.

Micron's HBM3E 12H 36GB delivers 50% increased capacity over HBM3E 8H 24GB and 20% lower power consumption compared to competitors. The company's upcoming HBM4 solution is expected to boost performance by over 50% compared to HBM3E.

Positive
  • First and only company shipping both HBM3E and SOCAMM products
  • SOCAMM offers 2.5x higher bandwidth and two-thirds (~67%) power savings vs traditional solutions
  • HBM3E 12H provides 50% higher capacity and 20% lower power consumption vs competition
  • Volume production achieved for SOCAMM solution
Negative
  • None.

Insights

Micron's announcement confirms its strategic position as a critical supplier in the AI computing infrastructure chain, particularly through its exclusive status as the only memory provider shipping both HBM3E and SOCAMM products. This reinforces Micron's competitive edge in the high-margin, high-growth AI memory space where supply constraints have been prevalent.

The deep integration with NVIDIA's newest platforms (HGX B300, GB300, HGX B200, GB200) signals Micron's strong positioning with the dominant AI chip leader. This relationship should translate to substantial revenue streams as NVIDIA's next-generation AI systems deploy throughout data centers globally.

Most significant is Micron's technical leadership: the HBM3E 12H 36GB offers 50% higher capacity than the previous generation while delivering 20% lower power consumption than competitors. The new SOCAMM solution's superior metrics (2.5x higher bandwidth, one-third the power consumption) address the critical AI compute bottlenecks of memory bandwidth and power efficiency.

With memory being the essential foundation for AI processing capabilities, Micron has positioned itself at the heart of the AI infrastructure boom, potentially securing premium pricing power through technological differentiation rather than competing solely on commodity terms.

The technical specifications of Micron's new memory solutions represent meaningful advancement in addressing the memory wall challenges facing AI deployments. The SOCAMM innovation delivers four important technical advantages that directly impact AI performance metrics:

First, the 2.5x bandwidth improvement over RDIMMs directly enhances neural network training throughput and model inference speed - critical factors that determine competitive advantage in AI deployment economics.

Second, the radical 67% power reduction versus standard DDR5 addresses one of the most pressing issues in AI infrastructure: thermal constraints and operating costs. This power efficiency multiplies across thousands of nodes in hyperscale deployments.

Third, the 128GB capacity in the compact SOCAMM form factor enables more comprehensive models with larger parameter counts per server node, critical for next-generation foundation models.

Finally, Micron's extension of this technology from data centers to edge devices through automotive-grade LPDDR5X solutions creates a unified memory architecture that simplifies AI deployment across computing environments.

These advancements position Micron to capture value throughout the entire AI computing stack rather than just in specialized applications.

Micron HBM3E 12H and LPDDR5X-based SOCAMM solutions designed to unlock full potential of AI platforms

SAN JOSE, Calif., March 18, 2025 (GLOBE NEWSWIRE) -- GTC 2025 -- Secular growth of AI is built on the foundation of high-performance, high-bandwidth memory solutions. These high-performing memory solutions are critical to unlock the capabilities of GPUs and processors. Micron Technology, Inc. (Nasdaq: MU), today announced it is the world’s first and only memory company shipping both HBM3E and SOCAMM (small outline compression attached memory module) products for AI servers in the data center. This extends Micron’s industry leadership in designing and delivering low-power DDR (LPDDR) for data center applications.

Micron’s SOCAMM, a modular LPDDR5X memory solution, was developed in collaboration with NVIDIA to support the NVIDIA GB300 Grace Blackwell Ultra Superchip. The Micron HBM3E 12H 36GB is also designed into the NVIDIA HGX B300 NVL16 and GB300 NVL72 platforms, while the HBM3E 8H 24GB is available for the NVIDIA HGX B200 and GB200 NVL72 platforms. The deployment of Micron HBM3E products in NVIDIA Hopper and NVIDIA Blackwell systems underscores Micron’s critical role in accelerating AI workloads.

Think AI, think memory, think Micron
At GTC 2025, Micron will showcase its complete AI memory and storage portfolio to fuel AI from the data center to the edge, highlighting the deep alignment between Micron and its ecosystem partners. Micron’s broad portfolio includes HBM3E 8H 24GB and HBM3E 12H 36GB, LPDDR5X SOCAMMs, GDDR7 and high-capacity DDR5 RDIMMs and MRDIMMs. Additionally, Micron offers an industry-leading portfolio of data center SSDs and automotive and industrial products such as UFS4.1, NVMe® SSDs and LPDDR5X, all of which are suited for edge compute applications.

“AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron’s contributions to the NVIDIA Grace Blackwell platform yield significant performance and power-saving benefits for AI training and inference applications,” said Raj Narasimhan, senior vice president and general manager of Micron’s Compute and Networking Business Unit. “HBM and LP memory solutions help unlock improved computational capabilities for GPUs.”

SOCAMM: a new standard for AI memory performance and efficiency
Micron’s SOCAMM solution is now in volume production. The modular SOCAMM solution enables accelerated data processing, superior performance, unmatched power efficiency and enhanced serviceability to provide high-capacity memory for increasing AI workload requirements.

Micron SOCAMM is the world's fastest, smallest, lowest-power and highest capacity modular memory solution,1 designed to meet the demands of AI servers and data-intensive applications. This new SOCAMM solution enables data centers to get the same compute capacity with better bandwidth, improved power consumption and scaling capabilities to provide infrastructure flexibility.

  • Fastest: SOCAMMs provide over 2.5 times higher bandwidth at the same capacity when compared to RDIMMs, allowing faster access to larger training datasets and more complex models, as well as increasing throughput for inference workloads.2
  • Smallest: At 14x90mm, the innovative SOCAMM form factor occupies one-third of the size of the industry-standard RDIMM form factor, enabling compact, efficient server design.3
  • Lowest power: Leveraging LPDDR5X memory, SOCAMM products consume one-third the power compared to standard DDR5 RDIMMs, inflecting the power performance curve in AI architectures.4
  • Highest capacity: SOCAMM solutions use four placements of 16-die stacks of LPDDR5X memory to enable a 128GB memory module, offering the highest capacity LPDDR5X memory solution, which is essential for advancements towards faster AI model training and increased concurrent users for inference workloads.  
  • Optimized scalability and serviceability: SOCAMM’s modular design and innovative stacking technology improve serviceability and aid the design of liquid-cooled servers. The enhanced error correction feature in Micron’s LPDDR5X with data center-focused test flows, provides an optimized memory solution designed for the data center.
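The endnoted comparisons behind the "fastest" claim reduce to simple bandwidth arithmetic. A minimal sketch, assuming the configurations cited in endnotes 1 and 2 (one 128-bit-bus SOCAMM at 8533 MT/s versus a 64-bit-bus DDR5 RDIMM at 6400 MT/s); the helper function is illustrative, not Micron's methodology:

```python
# Peak bandwidth = transfer rate (transfers/s) x bytes moved per transfer.
# Module and bus parameters follow the release's endnotes 1-2; this is an
# illustrative check, not Micron's internal calculation.

def peak_bandwidth_gbps(transfer_rate_mts: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

socamm = peak_bandwidth_gbps(8533, 128)  # 128-bit bus SOCAMM at 8533 MT/s
rdimm = peak_bandwidth_gbps(6400, 64)    # 64-bit bus DDR5 RDIMM at 6400 MT/s

print(f"SOCAMM ~{socamm:.0f} GB/s, RDIMM ~{rdimm:.0f} GB/s")
print(f"ratio ~{socamm / rdimm:.2f}x")   # ~2.67x, consistent with "over 2.5x"
```

With these inputs the ratio works out to about 2.67x, which matches the release's "over 2.5 times" wording.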

Industry-leading HBM solutions
Micron continues its competitive lead in the AI industry with the HBM3E 12H 36GB, which offers 50% increased capacity over the HBM3E 8H 24GB within the same cube form factor.5 Additionally, the HBM3E 12H 36GB provides up to 20% lower power consumption compared to the competition's HBM3E 8H 24GB offering, while providing 50% higher memory capacity.6
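The capacity figure is straightforward stack arithmetic; a quick check using the values stated above (the script itself is only illustrative):

```python
# HBM3E 12-high 36GB versus 8-high 24GB in the same cube form factor.
capacity_gain = (36 - 24) / 24
print(f"{capacity_gain:.0%} higher capacity")  # -> 50% higher capacity
```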

By continuing to deliver exceptional power and performance metrics, Micron aims to maintain its technology momentum as a leading AI memory solutions provider through the launch of HBM4. Micron’s HBM4 solution is expected to boost performance by over 50% compared to HBM3E.7

Complete memory and storage solutions designed for AI from the data center to the edge
Micron also has a proven portfolio of storage products designed to meet the growing demands of AI workloads. Advancing storage performance and power efficiency at this pace requires tight collaboration with ecosystem partners to ensure interoperability and a seamless customer experience. Micron delivers SSDs optimized for AI workloads such as inference, training, data preparation, analytics and data lakes. Micron will be showcasing the following storage solutions at GTC:

  • High-performance Micron 9550 NVMe and Micron 7450 NVMe SSDs included on the GB200 NVL72 recommended vendor list.
  • Micron’s PCIe Gen6 SSD, demonstrating over 27GB/s of bandwidth in successful interoperability testing with leading PCIe switch and retimer vendors, driving the industry to this new generation of flash storage.
  • Storing more data in less space is essential to get the most out of AI data centers. The Micron 61.44TB 6550 ION NVMe SSD is the drive of choice for bleeding-edge AI cluster exascale storage solutions, delivering over 44 petabytes of storage per rack,8 with 14GB/s and 2 million IOPS per drive inside a 20-watt footprint.
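The per-rack figure in the last bullet follows directly from the assumptions stated in endnote 8 (20 drives per 1U server, 36 server rack units per rack); a quick sanity check:

```python
# ">44 PB per rack" from endnote 8's stated assumptions.
DRIVES_PER_SERVER = 20   # 20x E3.S slots per 1U server
DRIVE_TB = 61.44         # Micron 6550 ION capacity in TB
SERVERS_PER_RACK = 36    # rack units available for storage servers

rack_tb = DRIVES_PER_SERVER * DRIVE_TB * SERVERS_PER_RACK
print(f"~{rack_tb / 1000:.1f} PB per rack")  # -> ~44.2 PB per rack
```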

As AI and generative AI expand and are integrated on-device at the edge, Micron is working closely with key ecosystem partners to deliver innovative AI solutions for automotive, industrial and consumer applications. In addition to high performance, these applications demand enhanced quality, reliability and longevity for their usage models.

  • One example of this type of ecosystem collaboration is the integration of Micron LPDDR5X on the NVIDIA DRIVE AGX Orin platform. This combined solution provides increased processing performance and bandwidth while also reducing power consumption.
  • By utilizing Micron’s 1β (1-beta) DRAM node, LPDDR5X memory meets automotive and industrial requirements and offers higher speeds up to 9.6 Gbps and increased capacities from 32Gb to 128Gb to support higher bandwidth.
  • Additionally, Micron LPDDR5X automotive products support operating environments from -40 degrees Celsius up to 125 degrees Celsius to provide a wide temperature range that meets automotive quality and standards.

Micron will exhibit its full data center memory and storage product portfolio at GTC, March 17 – 21, in booth #541.

Think AI, think memory, think Micron

About Micron Technology, Inc.
Micron Technology, Inc. is an industry leader in innovative memory and storage solutions, transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence (AI) and compute-intensive applications that unleash opportunities — from the data center to the intelligent edge and across the client and mobile user experience. To learn more about Micron Technology, Inc. (Nasdaq: MU), visit micron.com. 

© 2025 Micron Technology, Inc. All rights reserved. Information, products, and/or specifications are subject to change without notice. Micron, the Micron logo, and all other Micron trademarks are the property of Micron Technology, Inc. All other trademarks are the property of their respective owners.

Micron Media Relations Contact 
  Kelly Sasso 
  Micron Technology, Inc. 
  +1 (208) 340-2410 
  ksasso@micron.com

___________________
1 Calculations based on comparing one 64GB 128-bit bus SOCAMM to two 32GB 64-bit bus RDIMMs.

2 Calculated using transfer speeds comparing 64GB 2R 8533MT/s SOCAMM and 64GB 2Rx4 6400MT/s RDIMMs.

3 Calculated area between one SOCAMM and one RDIMM.

4 Calculated based on power used in watts by one 128GB, 128-bit bus width SOCAMM compared to two 128GB, 128-bit bus width DDR5 RDIMMs.

5 Comparison based on HBM3E 36GB capacity versus HBM3E 24GB capacity when both are at the 12x10mm package size.

6 Based on internal calculations, and customer testing and feedback for Micron HBM3E versus the competition’s HBM3E offerings.

7 Calculated bandwidth by comparing HBM4 and HBM3E specifications.

8 Assumes 20x 61.44TB E3.S SSDs in a 1U server with 20x E3.S slots available for storage and that 36 rack units are available for the servers in each rack.


FAQ

What are the key advantages of Micron's new SOCAMM memory solution for AI servers?

SOCAMM offers 2.5x higher bandwidth than RDIMMs, one-third the size of a standard RDIMM, one-third the power of DDR5 RDIMMs, and 128GB capacity with enhanced error correction.

How does Micron's HBM3E 12H 36GB performance compare to competitors?

It provides 50% higher capacity than HBM3E 8H 24GB and 20% lower power consumption compared to competitors' HBM3E 8H 24GB offerings.

What NVIDIA platforms will use Micron's new memory solutions?

HBM3E 12H 36GB is designed for NVIDIA HGX B300 NVL16 and GB300 NVL72, while HBM3E 8H 24GB is for NVIDIA HGX B200 and GB200 NVL72 platforms.

What performance improvement is expected from Micron's upcoming HBM4 solution?

Micron's HBM4 solution is expected to deliver over 50% performance improvement compared to HBM3E.