Advanced Micro Devices (AMD) has projected a significant expansion in the market for data center artificial intelligence (AI) processors, estimating the total addressable market at $45 billion for the current year. This forecast represents a notable increase from the $30 billion market size calculated by the company in June.
AMD Aims for Over $2 Billion in AI Chip Sales in 2024
AMD's announcement came alongside the launch of its new generation of AI chips, the MI300 series. The lineup includes two key models: the MI300X, tailored for generative AI applications, and the MI300A, designed for supercomputers. The MI300X features advanced high-bandwidth memory, enhancing its performance capabilities.
On future sales expectations, AMD CEO Lisa Su revealed, per Market Screener, that the company has secured customer commitments for AI chips in the upcoming year valued at well over $2 billion. This statement underlines AMD's preparedness to meet the growing demand in the AI chip market.
AMD's strategic move positions it to compete more aggressively with Nvidia, which currently holds a dominant share in the AI chip market. Nvidia, known for its stronghold in this sector, has reported substantial data center revenue but has not specifically broken out its AI revenue.
Analysts estimate that Nvidia and custom processors built by companies like Alphabet's Google and Microsoft account for approximately 80% of the AI chip market.
The growth trajectory of the AI chip market is expected to be steep, with AMD projecting the market size to reach around $400 billion by 2027. This expectation reflects the rapidly increasing demand for AI chips, particularly in data center applications.
AMD Unveils Cutting-Edge Chips for Accelerated AI Training Speeds
The launch of AMD's new processors and accompanying software updates signifies the company's commitment to capturing a significant share of this burgeoning market.
As noted above, AMD has unveiled the MI300X, an advanced chip designed to significantly accelerate AI training. The Verge reported that the MI300X is a breakthrough in chip technology, boasting 192GB of HBM3 memory and a peak theoretical memory bandwidth of 5.3 TB/s. This places the MI300X ahead of its competitors in memory and computing capabilities, offering more than double the memory of Nvidia's H100 chip and 1.3 times its computing power.
The MI300X's design allows for increased efficiency in running AI models, enabling the training and inference of larger models, or more models, on the same server. This advancement is crucial for the most demanding data center workloads, particularly generative AI, which requires significant computational resources to train and refine models with billions of parameters.
Alongside the MI300X, AMD introduced the Ryzen 8040 series of CPUs, which the company says offer its most powerful on-CPU AI capabilities to date. That launch marks a significant step in AMD's strategy to become a major player in PC-based AI as well.
Photo: Timothy Dykes/Unsplash

