Micron First to Ship Critical Memory for AI Data Centers
May 1, 2024 | Micron | Estimated reading time: 3 minutes
Micron Technology, Inc. announced it is leading the industry by validating and shipping its high-capacity monolithic 32Gb DRAM die-based 128GB DDR5 RDIMM memory at speeds up to 5,600 MT/s on all leading server platforms. Powered by Micron’s industry-leading 1β (1-beta) technology, the 128GB DDR5 RDIMM memory delivers more than 45% improved bit density, up to 22% improved energy efficiency and up to 16% lower latency over competitive 3DS through-silicon via (TSV) products.
Micron’s collaboration with industry leaders and customers has yielded broad adoption of these new high-performance, large-capacity modules across high-volume server CPUs. These high-speed memory modules were engineered to meet the performance needs of a wide range of mission-critical applications in data centers, including artificial intelligence (AI) and machine learning (ML), high-performance computing (HPC), in-memory databases (IMDBs) and efficient processing for multithreaded, high-core-count general compute workloads. Micron’s 128GB DDR5 RDIMM memory will be supported by a robust ecosystem including AMD, Hewlett Packard Enterprise (HPE), Intel and Supermicro, along with many others.
“With this latest volume shipment milestone, Micron continues to lead the market in providing high-capacity RDIMMs that have been qualified on all the major CPU platforms to our customers,” said Praveen Vaidyanathan, vice president and general manager of Micron’s Compute Products Group. “AI servers will now be configured with Micron’s 24GB 8-high HBM3E for GPU-attached memory and Micron’s 128GB RDIMMs for CPU-attached memory to deliver the capacity, bandwidth and power-optimized infrastructure required for memory-intensive workloads.”
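As a rough illustration of how these two memory tiers add up, the sketch below tallies GPU-attached HBM3E and CPU-attached RDIMM capacity for a hypothetical dual-socket, eight-GPU server. The six-stacks-per-GPU and twelve-DIMM-slots-per-socket figures are assumptions made for the example, not configurations stated by Micron.

```python
# Back-of-the-envelope capacity tally for a hypothetical AI server.
# Assumed (not from the announcement): 8 GPUs, 6 HBM3E stacks per GPU,
# 2 CPU sockets, 12 DIMM slots per socket.
HBM3E_STACK_GB = 24      # Micron 24GB 8-high HBM3E stack
RDIMM_GB = 128           # Micron 128GB DDR5 RDIMM

gpus, stacks_per_gpu = 8, 6
sockets, dimm_slots_per_socket = 2, 12

gpu_attached_gb = gpus * stacks_per_gpu * HBM3E_STACK_GB      # 1,152 GB
cpu_attached_gb = sockets * dimm_slots_per_socket * RDIMM_GB  # 3,072 GB

print(f"GPU-attached HBM3E: {gpu_attached_gb} GB")
print(f"CPU-attached DDR5 RDIMM: {cpu_attached_gb} GB")
```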
Industry Adoption
“A core tenet of our work with Micron is advancing the capabilities of data center infrastructure through highly performant memory for compute-intensive workloads,” said Dan McNamara, senior vice president and general manager, Server Business Unit, AMD. “Through this collaboration, our joint customers can now get immediate impact out of the high-capacity DDR5 memory offering from Micron in an AMD EPYC CPU powered server, delivering the performance and efficiency needed for the modern data center.”
“Adopting advanced memory capabilities, while ensuring high performance and efficiency, is critical to supporting growing AI workloads in training, tuning, and inferencing,” said Krista Satterthwaite, senior vice president and general manager, Compute at HPE. “We are committed to providing the most high-performing, energy-efficient solutions, and through our collaboration with Micron, plan to deliver monolithic, high-density DRAM across our AI portfolio to help our enterprise customers gain optimal performance to tackle any workload.”
“Micron’s 128GB DDR5 RDIMM memory is the first 32Gb monolithic DRAM-based high-capacity DIMM that has completed Intel platform memory compatibility qualification on 4th and 5th Gen Intel® Xeon® processors,” said Dr. Dimitrios Ziakas, vice president of Intel’s Memory and IO Technologies, Intel Corporation. “DDR5 DIMMs based on 32Gb-density die accelerate critical server and AI system configurations, bringing forward key performance, capacity and, most importantly, power efficiency benefits to Intel® Xeon® processor-based systems. We are excited to continue our collaboration with Micron to drive broad adoption of innovative products in the market that solve memory capacity and power bottlenecks for AI and server customers.”
“Supermicro is leading the industry with the broadest accelerated server and solution portfolio based on NVIDIA, AMD and Intel,” said Wally Liaw, senior vice president of Business Development and co-founder at Supermicro. “Savvy customers are looking for a large memory footprint, performance, and efficiency improvements in their AI infrastructure. Customers can benefit significantly from Supermicro’s advanced GPU SuperServers with the new 32Gb monolithic DRAM-based 128GB memory, and we are excited to collaborate with Micron to enable this.”
Micron 128GB DDR5 RDIMM memory is available now directly from Micron and will be available through select global channel distributors and resellers in June 2024. As part of its comprehensive data center memory portfolio, Micron offers a wide array of memory options across DDR5 RDIMMs, MCRDIMMs, MRDIMMs, CXL and LPDDR5x form factors to allow customers to integrate optimized solutions for AI and high-performance computing (HPC) applications that suit their needs for bandwidth, capacity and power optimization.