At the SK AI Summit 2024, SK hynix CEO Kwak Noh-Jung unveiled the world's first 16-high 48GB HBM3E memory solution, pushing AI memory capabilities to unprecedented levels. The advanced HBM3E solution promises enhanced performance in AI accelerators, with initial samples ready by early 2025.
SK hynix Unveils New 16-Hi HBM3E Memory Solution
At the SK AI Summit 2024 in Seoul, CEO Kwak Noh-Jung presented the 16-Hi HBM3E solution. With 48 GB of capacity, it offers the highest capacity and layer count of any HBM product to date, and initial samples of the memory are expected to reach customers in early 2025.
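As a quick sanity check on the arithmetic (an illustration, not part of the announcement): a 48 GB stack built from 16 DRAM layers implies 3 GB, or 24 Gb, per core die.

```python
# Illustrative arithmetic (assumption: capacity is split evenly
# across the 16 stacked DRAM core dies).
total_capacity_gb = 48
layers = 16

per_die_gb = total_capacity_gb / layers   # 3.0 GB per die
per_die_gbit = per_die_gb * 8             # 24 Gb per die

print(f"{per_die_gb:.0f} GB ({per_die_gbit:.0f} Gb) per DRAM layer")
# → 3 GB (24 Gb) per DRAM layer
```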
Highest Capacity and Layer Count in the Industry
While speaking at the SK AI Summit in Seoul, CEO Kwak Noh-Jung unveiled the company's first 48GB 16-high HBM3E, an industry record for both capacity and layer count; the company's HBM3E lineup previously topped out at 12 layers.
Here is a rundown of what Mr. Kwak had to say, according to WCCFTECH:
- Although the market for 16-high HBM is expected to open with the HBM4 generation, SK hynix will provide 48GB 16-high HBM3E samples to customers early next year to secure technological stability.
- To build 16-high HBM3E, SK hynix applied its Advanced MR-MUF process, which already enabled mass production of 12-high products, while developing hybrid bonding technology as a backup.
- Compared with 12-high products, 16-high stacks improve AI inference performance by 32% and training performance by 18%. As the market for AI inference accelerators grows, 16-high solutions are expected to help the company maintain its leadership in AI memory.
- SK hynix is leveraging its competitiveness in low-power, high-performance products by developing LPCAMM2 modules for PCs and data centers, along with LPDDR5 and LPDDR6 built on its 1c-nm node.
- On the storage side, the company is preparing UFS 5.0, high-capacity QLC-based eSSDs, and PCIe Gen6 SSDs.
- Starting with the HBM4 generation, SK hynix plans to adopt a logic process for the base die in partnership with a leading global logic foundry, in order to offer customers the best possible products.
- In anticipation of a paradigm shift in artificial intelligence memory, customized HBM will be a high-performance solution that caters to specific user needs in terms of capacity, bandwidth, and functionality.
- To get around the "memory wall," SK hynix is developing solutions that augment memory with computational capabilities. Technologies such as Computational Storage, Processing in Memory (PIM), and Processing Near Memory (PNM) will be crucial for handling the massive data sets of future AI systems, and could reshape the foundation of the AI industry as a whole.
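The quoted performance uplifts can be read as simple multipliers over a 12-high baseline. A minimal sketch, assuming an arbitrary baseline of 100 (only the 32% and 18% figures come from the keynote):

```python
# Hypothetical baseline of 100 (arbitrary units) for a 12-high system;
# only the uplift percentages (32% inference, 18% training) are from
# the announcement.
baseline = 100
uplift_pct = {"inference": 32, "training": 18}

sixteen_high = {task: baseline * (100 + pct) / 100
                for task, pct in uplift_pct.items()}
print(sixteen_high)  # → {'inference': 132.0, 'training': 118.0}
```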

