AMD unveiled its Instinct MI325X AI chip on Thursday, positioning it as a direct competitor to Nvidia's upcoming Blackwell GPUs. The move could put pressure on Nvidia's data center dominance and intensify the AI chip race.
AMD Unveils AI Chip to Rival Nvidia’s Dominance
On Thursday, AMD unveiled a new AI chip that competes head-on with Nvidia's graphics processing units (GPUs), the chips that power AI workloads in data centers.
At an event introducing the new product on Thursday, AMD said production of the Instinct MI325X chip will begin before the end of 2024, as reported by CNBC. Nvidia, which has seen strong demand for its GPUs and gross margins above 75% over the past year, could face pricing pressure from AMD's artificial intelligence chips if developers and cloud giants come to view them as close substitutes for Nvidia's products.
Rising Demand for AI Chips Fuels Competition
More companies are bringing AI chips to market to meet demand from advanced generative AI systems such as OpenAI's ChatGPT, which require large data centers stocked with GPUs to process their workloads.
AMD has long ranked second in GPUs, and Nvidia has cornered the market for data center graphics processors in recent years. Now, AMD is determined to take market share from its Silicon Valley rival, or at least to capture a significant slice of a market it predicts will be worth $500 billion by 2028.
AMD’s Push for AI Market Share Intensifies
“AI demand has actually continued to take off and actually exceed expectations. It’s clear that the rate of investment is continuing to grow everywhere,” AMD CEO Lisa Su stated during the event.
At the presentation, AMD did not announce any major new cloud or internet customers for its Instinct GPUs. The company has previously said that OpenAI uses its AI GPUs and that Meta and Microsoft buy them. AMD did not disclose pricing for the Instinct MI325X, which is typically sold as part of a complete server alongside other components.
Accelerating Product Timelines to Challenge Nvidia
To compete more closely with Nvidia and capitalize on surging demand for AI chips, AMD is accelerating its product schedule with the debut of the MI325X and plans to release new chips on an annual cadence. The new AI chip succeeds the MI300X, which launched at the end of last year. AMD said its 2025 and 2026 chips will be called the MI350 and MI400, respectively.
The release of the MI325X pits it against Nvidia's forthcoming Blackwell processors, which the graphics card maker has said will begin shipping in large quantities in early 2025.
Investor Interest Shifts as Competition Heats Up
A strong launch for AMD's next data center GPU could draw attention from investors looking for more beneficiaries of the AI boom. Nvidia's stock has risen more than 175% in 2024, while AMD's is up only 20%. Most analysts estimate that Nvidia controls over 90% of the market for data center AI chips.
AMD shares fell 4% on Thursday, while Nvidia's stock rose about 1%.
ROCm Software as AMD’s Answer to Nvidia’s CUDA
AMD faces a significant challenge in capturing market share because Nvidia's chips are programmed with the company's proprietary language, CUDA, which has become the de facto standard for artificial intelligence developers and effectively locks programmers into Nvidia's tools and ecosystem.
In response, AMD said this week that it has been improving its competing software, ROCm, to make it easier for AI developers to move more of their AI models onto its accelerators.
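As a rough illustration of what that portability looks like in practice, and not something taken from AMD's announcement: ROCm builds of PyTorch expose the same torch.cuda interface that CUDA developers already use, so device-agnostic code along the lines of the sketch below can often run unchanged on either vendor's accelerators.

```python
# Minimal sketch, not from AMD's announcement: device-agnostic PyTorch code.
# On ROCm builds of PyTorch, torch.cuda.is_available() reports AMD GPUs (via HIP),
# so the same script can target Nvidia or AMD accelerators without modification.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1024, 1024).to(device)   # stand-in for a real AI model
batch = torch.randn(8, 1024, device=device)      # dummy input batch

with torch.no_grad():
    output = model(batch)

print(f"Ran on {device}; output shape: {tuple(output.shape)}")
```

How cleanly a given model moves over depends on its use of vendor-specific kernels, which is the gap AMD says its ROCm work is meant to close.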
AI Performance Gains with AMD’s Superior Memory
AMD has positioned its artificial intelligence (AI) accelerators as stronger in inference, the scenario in which AI models generate predictions or content, than in training, where models improve by digesting vast amounts of data. The company said this is because its superior memory lets its chips outpace certain Nvidia chips when serving models such as Meta's Llama.
According to Su, the MI325X platform can deliver inference performance up to 40% better than Nvidia's H200 when running Meta's Llama 3.1 large language AI model.

