Microsoft this week deployed the first crop of its homegrown AI chips in one of its data centers, with plans to roll out more in the coming months, it says.
The chip, named Maia 200, is designed to be what Microsoft calls an “AI inference powerhouse,” meaning it’s optimized for the compute-intensive work of running AI models in production. The company released some impressive processing-speed specs for the Maia, saying it outperforms Amazon’s latest Trainium chips and Google’s latest Tensor Processing Units (TPUs).
All the cloud giants have turned to their own AI chip designs in part because of the difficulty, and cost, of getting the latest and greatest from Nvidia — a supply crunch that shows no signs of letting up.
But even with its own state-of-the-art, high-performance chip in hand, Microsoft CEO Satya Nadella said the company will still buy chips made by others.
“We have a good partnership with Nvidia, with AMD. They innovate. We innovate,” he explained. “I think a lot of people are going to talk about who’s on top. Just remember, you’ve got to be on top all the time.”
He added: “Because we can vertically integrate does not mean that we only vertically integrate,” referring to building Microsoft’s own systems from top to bottom, without using products from other vendors.
As such, the Maia 200 will be used by Microsoft’s so-called Superintelligence team, the AI specialists building the software giant’s own frontier models under Mustafa Suleyman, the former Google DeepMind co-founder who now leads the team. Microsoft is working on its own models to perhaps one day reduce its reliance on OpenAI, Anthropic, and other model makers.
The Maia 200 chip will also support OpenAI models running on Microsoft’s Azure cloud platform, the company said. But, by all accounts, ensuring access to the most advanced AI hardware is still a challenge for everyone, paying customers and internal teams alike.
In an X post, Suleyman was clearly enjoying sharing the news that his team got first dibs. “It’s a big day,” he wrote when the chip was launched. “Our Superintelligence team will be the first to use the Maia 200 as we develop our frontier AI models.”