Microsoft Integrates AMD AI Chips into Azure, Competing with Nvidia’s GPU Offerings

Microsoft introduces AMD AI chips on Azure, providing cloud customers an alternative to Nvidia’s GPUs.

Microsoft announced Thursday that it will integrate AMD's MI300X processors into its Azure cloud services, offering a competitive alternative to Nvidia's GPUs.

Azure's New AI Processing Power

According to Reuters, Microsoft announced on Thursday that it will offer its cloud computing customers a platform of AMD artificial intelligence (AI) chips to rival Nvidia's components, with further details to come at its Build developer conference the following week.

Clusters of Advanced Micro Devices' flagship MI300X AI chips will be sold through Microsoft's Azure cloud computing service, giving customers an alternative to Nvidia's powerful H100 family of GPUs, which dominate the market for AI data center chips and can be difficult to procure.

Because a single processor rarely has enough capacity to handle the required data and computation, organizations frequently must combine many GPUs to train AI models or run applications.
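As a minimal sketch of what that aggregation looks like in practice, the PyTorch snippet below replicates a toy model across every GPU visible to the machine and splits each batch among them. The model, batch size, and layer dimensions are illustrative assumptions, not details from the article.

```python
# Minimal sketch (assumes PyTorch with CUDA GPUs available): spreading one
# workload across several GPUs because a single card cannot handle it alone.
import torch
import torch.nn as nn

# Toy model standing in for a much larger AI model.
model = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 10))

if torch.cuda.device_count() > 1:
    # Replicate the model on every available GPU and split each batch across them.
    model = nn.DataParallel(model)
model = model.to("cuda")

batch = torch.randn(256, 4096, device="cuda")  # illustrative input batch
output = model(batch)  # the forward pass is sharded across the GPUs
```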

AMD's AI Market Growth

AMD, a key player in the AI processor market, projects roughly $4 billion in AI chip revenue for the current year, an outlook backed by its claim that the chips can both train and run large AI models.

In addition to Nvidia's premium AI processors, Microsoft's cloud computing division offers access to its proprietary AI chips, known as Maia.

Microsoft's Build Conference to Reveal New AI Features

Separately, TechCrunch reported that at an analyst briefing held ahead of Build, Scott Guthrie, executive vice president of Microsoft's Cloud and AI business, specifically compared the company's Arm-based Cobalt CPU to AWS's Graviton chips, which developers have been able to use for several years.

According to Guthrie, Microsoft's chips will perform 40% better than competing Arm CPUs, and Adobe, Snowflake, and other companies are already using them.

Natural Language Integration in Azure with Copilot

Microsoft will also release a new feature that lets developers use natural language to control their Azure resources directly from Copilot.

"That's going to enable an even tighter developer loop with natural language across your development stack and Azure," Guthrie said. As that system is constructed upon a common extensibility mechanism, more suppliers will be able to plug into it and provide comparable functionalities.

Photo: Praswin Prakashan/Unsplash
