Microsoft announced Thursday that it will integrate AMD's MI300X processors into its Azure cloud services, offering a competitive alternative to Nvidia's GPUs.
Azure's New AI Processing Power
According to Reuters, Microsoft said Thursday that it will offer its cloud computing customers a platform of AMD artificial intelligence (AI) chips as a rival to Nvidia's components, with further details to come at its Build developer conference next week.
Clusters of Advanced Micro Devices' flagship MI300X AI chips will be offered through Microsoft's Azure cloud computing service. They will give customers an alternative to Nvidia's powerful H100 family of GPUs, which dominate the AI data center chip market and can be difficult to procure amid high demand.
Because a single processor often cannot handle the data and computation involved, organizations typically must combine many GPUs into clusters to train AI models or run applications.
AMD's AI Market Growth
AMD, a key player in the AI chip market, has projected $4 billion in AI chip revenue for the current year, an outlook built on its claim that its chips are powerful enough to both train and run large AI models.
In addition to Nvidia's premium AI processors, Microsoft's cloud computing division offers access to its proprietary AI chips, known as Maia.
Microsoft's Build Conference to Reveal New AI Features
Separately, TechCrunch reported that at an analyst briefing held ahead of Build, Scott Guthrie, executive vice president of Microsoft's Cloud and AI business, compared Microsoft's Cobalt chips directly to AWS's Graviton chips, which developers have been able to use for several years.
Guthrie said Microsoft's chips would perform 40% better than competing Arm CPUs. Adobe, Snowflake, and other companies are already using the new chips.
Natural Language Integration in Azure with Copilot
Microsoft will also release a new feature that will let developers use natural language to control their Azure resources right from Copilot.
"That's going to enable an even tighter developer loop with natural language across your development stack and Azure," Guthrie said. Because the system is built on a common extensibility mechanism, more vendors will be able to plug into it and offer comparable functionality.
Photo: Praswin Prakashan/Unsplash





