
Microsoft and AMD Collaborate to Revolutionize AI Chips: Disrupting Nvidia's Dominance

Microsoft and AMD join forces to develop powerful AI chips, aiming to challenge Nvidia's market dominance and revolutionize the AI processor landscape.

Microsoft is said to be collaborating with AMD to bolster the chipmaker's entry into artificial intelligence (AI) processors. The partnership is designed to challenge Nvidia's market dominance, as Nvidia currently holds an estimated 80% share of the AI processor market. Microsoft is reportedly providing engineering resources to assist AMD's advancements in the space.

Sources claim that AMD is also aiding Microsoft in the development of its own in-house AI chips, codenamed Athena. Hundreds of employees from Microsoft's silicon division are said to be working on the project, with the company having invested around $2 billion in its development. Microsoft spokesperson Frank Shaw, however, has denied AMD's involvement with Athena.

The growing demand for AI services, such as OpenAI's ChatGPT, has increased the need for processors capable of handling massive computational workloads. Nvidia's commanding share of the market for graphics processing units (GPUs) – the specialized chips that provide this computing power – allows it to dominate the space. Currently, no suitable alternative exists, posing a problem for companies like Microsoft that depend on Nvidia's costly processors to power AI services in the Azure cloud.

Although AMD is a significant competitor in the gaming hardware industry, it lacks a suitable alternative to Nvidia's CUDA ecosystem for large-scale machine learning deployments. With the AI industry heating up, AMD aims to position itself to capitalize on the growth. CEO Lisa Su said during an earnings call, "We are very excited about our opportunity in AI — this is our number one strategic priority. We are in the very early stages of the AI computing era, and the rate of adoption and growth is faster than any other technology in recent history."

Su claims that AMD is well-positioned to create partly customized chips for major customers to use in their AI data centers. "I think we have a very complete IP portfolio across CPUs, GPUs, FPGAs, adaptive SoCs, DPUs, and a very capable semi-custom team," Su added, highlighting the company's prospects for "higher volume opportunities beyond game consoles."

AMD is also confident that its upcoming Instinct MI300 data center chip could be adapted for generative AI workloads. "MI300 is actually very well-positioned for both HPC or supercomputing workloads as well as for AI workloads," said Su. "And with the recent interest in generative AI, I would say the pipeline for MI300 has expanded considerably here over the last few months, and we're excited about that. We're putting in a lot more resources."

As Microsoft continues to work closely with Nvidia to secure more processors, the AI boom has led to a growing shortage of specialized GPU chips, a shortage exacerbated by Nvidia's near monopoly on supply. Microsoft and AMD aren't the only players attempting to develop in-house AI chips – Google has its own TPU (Tensor Processing Unit) for training AI models, and Amazon has created Trainium chips for training machine learning models.
