Microsoft announced Thursday that it will integrate AMD's MI300X processors into its Azure cloud services, offering a competitive alternative to Nvidia's GPUs.
Azure's New AI Processing Power
According to Reuters, Microsoft announced on Thursday that it intends to offer its cloud computing customers a platform of AMD artificial intelligence (AI) chips that will rival Nvidia's components, with further details to come at its Build developer conference next week.
Clusters of Advanced Micro Devices' flagship MI300X AI processors will be sold through Microsoft's Azure cloud computing service. They will offer an alternative to Nvidia's H100 family of powerful GPUs, which dominate the AI data center chip market, are in high demand, and can be difficult to procure.
Because a single processor cannot handle the data and computation that AI workloads require, organizations frequently have to link many GPUs together to develop AI models or run applications.
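For readers unfamiliar with what "linking many GPUs together" looks like in practice, the following is a minimal, illustrative sketch using PyTorch's DistributedDataParallel, which replicates a model across devices and synchronizes gradients so a job too large for one chip can be spread over several. The model, sizes, and file name are placeholders for illustration, not anything Microsoft or AMD describe.

# Illustrative multi-GPU data-parallel training loop (placeholder model and data).
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK / WORLD_SIZE / LOCAL_RANK for each GPU process.
    dist.init_process_group(backend="nccl")
    local_rank = dist.get_rank() % torch.cuda.device_count()
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(4096, 4096).cuda(local_rank)  # stand-in for a large model
    model = DDP(model, device_ids=[local_rank])            # wraps the model for gradient sync
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):
        x = torch.randn(32, 4096, device=local_rank)       # stand-in for a training batch
        loss = model(x).pow(2).mean()
        loss.backward()        # gradients are all-reduced across GPUs during backward
        optimizer.step()
        optimizer.zero_grad()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()

Launched with, for example, torchrun --nproc_per_node=8 train.py, this starts one process per GPU on a node, which is the basic pattern behind the GPU clusters cloud providers rent out.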
AMD's AI Market Growth
AMD, a key player in the AI processor market, projects roughly $4 billion in AI chip revenue for the current year, an outlook backed by its claim that the MI300X can both train and run large AI models.
In addition to Nvidia's premium AI processors, Microsoft's cloud computing division offers access to its proprietary AI chips, known as Maia.
Microsoft's Build Conference to Reveal New AI Features
Meanwhile, TechCrunch reported that Scott Guthrie, executive vice president of Microsoft's Cloud and AI business, compared Microsoft's Arm-based Cobalt CPUs to AWS's Graviton chips, which developers have been able to use for several years, at an analyst briefing held ahead of Build.
According to Guthrie, Microsoft's chips deliver 40% better performance than competing Arm CPUs, and Adobe, Snowflake, and other companies are already using them.
Natural Language Integration in Azure with Copilot
Microsoft will also release a new feature that lets developers use natural language to manage their Azure resources directly from Copilot.
"That's going to enable an even tighter developer loop with natural language across your development stack and Azure," Guthrie said. As that system is constructed upon a common extensibility mechanism, more suppliers will be able to plug into it and provide comparable functionalities.
Photo: Praswin Prakashan/Unsplash





