Micron Ships HBM4 to Key Customers to Power Next-Gen AI Platforms
June 11, 2025 | Micron | Estimated reading time: 2 minutes
The importance of high-performance memory has never been greater, fueled by its crucial role in supporting the growing demands of AI training and inference workloads in data centers. Micron Technology, Inc., announced the shipment of HBM4 36GB 12-high samples to multiple key customers. This milestone extends Micron’s leadership in memory performance and power efficiency for AI applications. Built on its well-established 1β (1-beta) DRAM process, proven 12-high advanced packaging technology and a highly capable memory built-in self-test (MBIST) feature, Micron HBM4 provides seamless integration for customers and partners developing next-generation AI platforms.
A leap forward
As use of generative AI continues to grow, the ability to effectively manage inference becomes more important. Micron HBM4 features a 2048-bit interface, achieving speeds greater than 2.0 TB/s per memory stack and more than 60% better performance over the previous generation.1 This expanded interface facilitates rapid communication and a high-throughput design that accelerates the inference performance of large language models and chain-of-thought reasoning systems. Simply put, HBM4 will help AI accelerators respond faster and reason more effectively.
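The bandwidth figure above follows directly from the interface width and the per-pin signaling rate. As a minimal sanity check (the per-pin rate below is an illustrative assumption, not a figure stated in the article), aggregate stack bandwidth is roughly the interface width times the per-pin rate:

```python
# Back-of-envelope check of the HBM4 bandwidth figure quoted above.
# Assumption (not from the article): aggregate bandwidth equals
# interface_width_bits * per_pin_rate_gbps, converted from Gb/s to TB/s.

def stack_bandwidth_tbps(interface_width_bits: int, per_pin_rate_gbps: float) -> float:
    """Aggregate stack bandwidth in TB/s (decimal units)."""
    total_gbits_per_s = interface_width_bits * per_pin_rate_gbps
    return total_gbits_per_s / 8 / 1000  # bits -> bytes, GB -> TB

# A 2048-bit interface at roughly 8 Gb/s per pin already exceeds 2.0 TB/s:
print(stack_bandwidth_tbps(2048, 8.0))  # 2.048
```

This also illustrates why doubling the interface width to 2048 bits matters: the same per-pin rate delivers twice the stack bandwidth of a 1024-bit HBM3E-class interface.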
Additionally, Micron HBM4 delivers over 20% better power efficiency than Micron’s previous-generation HBM3E products, which themselves set industry-leading benchmarks for HBM power efficiency.2 This improvement pairs higher throughput with lower power consumption, maximizing data center efficiency.
Generative AI use cases continue to multiply, and this transformative technology is poised to deliver significant benefits to society. HBM4 is a crucial enabler, driving quicker insights and discoveries that will foster innovation in diverse fields such as healthcare, finance and transportation.
"Micron HBM4’s performance, higher bandwidth and industry-leading power efficiency are a testament to our memory technology and product leadership," said Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit. "Building on the remarkable milestones achieved with our HBM3E deployment, we continue to drive innovation with HBM4 and our robust portfolio of AI memory and storage solutions. Our HBM4 production milestones are aligned with our customers’ next-generation AI platform readiness to ensure seamless integration and volume ramp."
Intelligence Accelerated: Micron’s role in the AI revolution
For nearly five decades, Micron has pushed the boundaries of memory and storage innovation. Today, Micron continues to accelerate AI by delivering a broad portfolio of solutions that turn data into intelligence, fueling breakthroughs from the data center to the edge. With HBM4, Micron reinforces its position as a critical catalyst for AI innovation and a reliable partner for its customers’ most demanding solutions.