At the SK AI Summit 2024, SK hynix CEO Kwak Noh-Jung unveiled the world's first 16-high 48GB HBM3E memory solution, pushing AI memory capabilities to unprecedented levels. The advanced HBM3E solution promises enhanced performance in AI accelerators, with initial samples ready by early 2025.
SK hynix Unveils New 16-Hi HBM3E Memory Solution
At the SK AI Summit 2024 in Seoul, CEO Kwak Noh-Jung presented the 16-Hi HBM3E solution, which provides 48 GB of capacity, the highest capacity and layer count of any HBM product to date. Initial samples of the new memory are expected to reach customers in early 2025.
Highest Capacity and Layer Count in the Industry
While speaking at the SK AI Summit in Seoul, CEO Kwak Noh-Jung unveiled the industry's first 48GB 16-high HBM3E, surpassing the company's previous 12-high products and setting an industry record for both capacity and layer count.
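The headline figures can be sanity-checked with simple arithmetic (an illustrative sketch, not vendor data): a 48GB 16-high stack implies 3GB (24Gb) per DRAM die, the same die density that yields 36GB in the existing 12-high product.

```python
# Back-of-the-envelope HBM stack capacity check (illustrative only).
# Assumes every DRAM die in the stack has the same density.

def stack_capacity_gb(layers: int, die_density_gbit: int) -> float:
    """Total stack capacity in gigabytes for `layers` dies
    of `die_density_gbit` gigabits each."""
    return layers * die_density_gbit / 8  # 8 bits per byte

# 16-high stack of 24Gb (3GB) dies -> the announced 48GB product
print(stack_capacity_gb(16, 24))  # 48.0
# The same dies in a 12-high stack -> the current 36GB HBM3E
print(stack_capacity_gb(12, 24))  # 36.0
```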
Here is a rundown of what Mr. Kwak had to say, according to WCCFTECH:
- Although the market for 16-high HBM is expected to open up with the HBM4 generation, SK hynix is preparing to supply 48GB 16-high HBM3E samples to customers early next year in order to secure technological stability ahead of time.
- To build 16-high HBM3E, SK hynix applied its Advanced MR-MUF process, already proven in the mass production of 12-high products, while developing hybrid bonding technology as a backup.
- Compared with 12-high products, 16-high products deliver a 32% improvement in inference performance and an 18% improvement in training performance. With the market for AI inference accelerators expected to grow, the 16-high solution is anticipated to help the company maintain its leadership in AI memory.
- SK hynix is leveraging its competitiveness in low-power, high-performance products by developing an LPCAMM2 module for PCs and data centers, as well as LPDDR5 and LPDDR6 based on its 1c-nm process.
- On the storage side, the company is preparing PCIe Gen6 SSDs, high-capacity QLC-based eSSDs, and UFS 5.0.
- Starting with the HBM4 generation, SK hynix plans to adopt a logic process for the base die in partnership with a leading global logic foundry, in order to offer customers the best possible products.
- In anticipation of a paradigm shift in artificial intelligence memory, customized HBM will be a high-performance solution that caters to specific user needs in terms of capacity, bandwidth, and functionality.
- Technologies such as Computational Storage, Processing in Memory (PIM), and Processing Near Memory (PNM) will be crucial for handling the massive data sets of future AI workloads. To overcome the "memory wall," SK hynix is developing solutions that augment memory with computational capabilities, a shift that could reshape the foundation of future AI systems and the direction of the AI industry as a whole.

