Microsoft will not stop buying AI chips from Nvidia and AMD, even after launching its own, Nadella says


Microsoft this week deployed the first crop of its homegrown AI chips in one of its data centers, with plans to roll out more in the coming months, the company says.

The chip, named Maia 200, is designed to be what Microsoft calls an “AI inference powerhouse,” meaning it’s optimized for the compute-intensive work of running AI models in production. The company released some impressive processing-speed specs for the Maia, saying it’s better than Amazon’s latest Trainium chips and Google’s latest Tensor Processing Units (TPUs).

All the cloud giants have turned to their own AI chip designs in part because of the difficulty, and cost, of getting the latest and greatest from Nvidia — a supply crunch that shows no signs of letting up.

But even with its own state-of-the-art, high-performance chip in hand, Microsoft CEO Satya Nadella said the company will still buy chips made by others.

“We have a good partnership with Nvidia, with AMD. They innovate. We innovate,” he explained. “I think a lot of people are going to talk about who’s on top. Just remember, you’ve got to be on top all the time.”

He added: “Because we can vertically integrate does not mean that we only vertically integrate,” referring to Microsoft’s ability to build its own systems from top to bottom without using products from other vendors.

As such, the Maia 200 will be used by Microsoft’s Superintelligence team, the AI specialists building the software giant’s own frontier models. The team is led by Mustafa Suleyman, the former Google DeepMind co-founder. Microsoft is working on its own models to perhaps one day reduce its reliance on OpenAI, Anthropic, and other model makers.


The Maia 200 chip will also support OpenAI models running on Microsoft’s Azure cloud platform, the company said. But, by all accounts, ensuring access to the most advanced AI hardware is still a challenge for everyone, paying customers and internal teams alike.

In an X post, Suleyman was clearly enjoying sharing the news that his team got first dibs. “It’s a big day,” he wrote when the chip was launched. “Our Superintelligence team will be the first to use the Maia 200 as we develop our frontier AI models.”
