AMD launched its Instinct MI325X AI chip on Thursday, positioning it as a direct competitor to Nvidia's upcoming Blackwell GPUs. This bold move could disrupt Nvidia’s data center dominance and intensify the AI chip war.
AMD Unveils AI Chip to Rival Nvidia’s Dominance
On Thursday, AMD unveiled a new AI chip that competes head-on with Nvidia's GPUs—the graphics processing units used in data centers.
At a launch event on Thursday, AMD announced that production of the Instinct MI325X chip will begin before the end of 2024, as reported by CNBC. Nvidia, which has seen strong demand for its GPUs and gross margins above 75% over the past year, could face pricing pressure from AMD's artificial intelligence chips if developers and cloud giants view them as close substitutes for Nvidia's products.
Rising Demand for AI Chips Fuels Competition
More chipmakers are offering AI chips to meet demand driven by advanced generative AI systems such as OpenAI's ChatGPT, which require large data centers stocked with GPUs to process data.
AMD has long ranked second in graphics processing units (GPUs) for data centers, a market Nvidia has come to dominate. Now, AMD is determined to win market share from its Silicon Valley rival, or at least capture a significant slice of the AI chip market it predicts will be worth $500 billion by 2028.
AMD’s Push for AI Market Share Intensifies
“AI demand has actually continued to take off and actually exceed expectations. It’s clear that the rate of investment is continuing to grow everywhere,” AMD CEO Lisa Su stated during the event.
At the presentation, AMD did not announce any major new cloud or internet customers for its Instinct GPUs. The company has previously said that OpenAI uses its AI GPUs and that Meta and Microsoft purchase them. AMD did not disclose pricing for the Instinct MI325X, which is usually sold in bundles with other server components.
Accelerating Product Timelines to Challenge Nvidia
To better compete with Nvidia and capitalize on surging demand for AI chips, AMD is accelerating its product schedule, starting with the debut of the MI325X, and plans to release new chips on an annual cadence. The new AI chip succeeds the MI300X, which was released at the end of last year. AMD said its 2025 and 2026 chips will be named the MI350 and MI400, respectively.
The release of the MI325X will pit it against Nvidia's forthcoming Blackwell processors, which the graphics card maker has promised will begin shipping in large quantities in early 2025.
Investor Interest Shifts as Competition Heats Up
Investors seeking more beneficiaries of the AI boom may take notice if AMD's next data center GPU has a strong launch. Nvidia's stock rose more than 175% in 2024, while AMD's gained only about 20%. Most analysts estimate that Nvidia controls over 90% of the market for data center AI chips.
AMD shares fell 4% on Thursday, while Nvidia's stock rose approximately 1%.
ROCm Software as AMD’s Answer to Nvidia’s CUDA
AMD faces a significant challenge in capturing market share because its competitor's chips use Nvidia's proprietary programming language, CUDA, which has become the de facto standard for artificial intelligence developers. That effectively locks programmers into the tools and resources provided by Nvidia.
In response, AMD said this week that it has been improving its competing software, ROCm, to make it easier for AI developers to port their AI models to AMD's accelerators.
AI Performance Gains with AMD’s Superior Memory
AMD has positioned its artificial intelligence (AI) accelerators to be most competitive at inference, where AI models generate predictions or content, rather than at training, where models improve by digesting vast amounts of data. The company attributes this to its chips' superior memory, which it says allows certain of its GPUs to outperform some Nvidia chips when serving Meta's Llama AI model.

According to Su, the MI325X platform can deliver inference performance up to 40% better than Nvidia's H200 when running Meta's large language model Llama 3.1.