All the Messy Drama Between OpenAI and Nvidia, Explained



OpenAI and Nvidia, the two darlings of the AI hype and longtime partners, seem to have had a bit of a falling out.

At the center of this rift is Nvidia’s $100 billion investment in OpenAI announced in September 2025. As part of the deal, Nvidia will build 10 gigawatts of AI data centers for OpenAI and invest $100 billion in the company in 10 installments, as each gigawatt comes online. In turn, OpenAI is reportedly planning to use billions of dollars in investment from Nvidia to rent Nvidia chips.

At the time, the investment stoked anxiety about circular dealmaking in the AI industry and an intricately woven web of financial dependencies that could signal instability, echoing the dotcom bubble. In other words, if even one cog fails and demand doesn't materialize as expected, it could trigger a domino effect that takes down the entire system.

In the September announcement, the companies said that the first gigawatt of computing power would come online in the second half of 2026 and that any other details would be finalized in the coming weeks. But in an Nvidia SEC filing from November, the investment in OpenAI is still described as "a letter of intent with an opportunity to invest."

Flash forward a couple of months, and a Wall Street Journal report from last week said that the talks have not progressed beyond the first stages and that Nvidia CEO Jensen Huang has privately criticized what he sees as a lack of discipline in OpenAI's business approach. Huang has reportedly spent the past few months privately stressing to industry partners that the $100 billion deal is incomplete and unfinished.

After that report, Huang tried to reassure reporters in Taipei, Taiwan, by praising OpenAI and saying Nvidia will be "fully involved" in the company's latest funding round ahead of a rumored IPO later this year. Huang described the planned investment as "probably the biggest investment we've ever made," but when asked if it would exceed $100 billion, he said, "No, no, nothing like that."

But that wasn't enough to quell investor fears, as another anonymously sourced report dropped a few days later. OpenAI is unhappy with the speed at which Nvidia chips can compute inference for some ChatGPT requests, and is looking to alternative chip providers (such as startups Cerebras and Groq) to supply 10% of its inference needs, according to a Reuters report on Tuesday.

The report also claims that OpenAI blames some of the weaknesses of its AI coding assistant Codex on Nvidia hardware.

In response, it's now the turn of OpenAI executives to praise Nvidia. CEO Sam Altman took to X to say that Nvidia makes "the best AI chips in the world," and executive Sachin Katti said Nvidia is "OpenAI's most important partner for training and inference."

But it seems that inference and its heavy memory requirements have also weighed on Nvidia lately. Inference grows more important than training as models mature, and the agentic AI hype also increases the amount of data an AI system handles during the inference stage, further raising the importance of memory.

To top it all off, Nvidia bought Groq (no, not Grok), the AI chip startup that OpenAI is reportedly looking at, in its biggest acquisition ever. Then, last month, Nvidia unveiled its new Rubin platform, with a presentation boasting inference and memory-bandwidth wins.

Google upped the ante

At the center of both Nvidia's and OpenAI's fears about each other is reportedly rising competition, especially from Google.

Late last year, Google became a far fiercer competitor to both leading AI developer OpenAI and leading hardware infrastructure giant Nvidia.

First came the tensor processing units (TPUs), Google's custom AI chips, which are designed for inference and for some tasks are considered better than the GPUs that dominate Nvidia's lineup. Google's TPUs are not only used for its own AI models but are also deployed by OpenAI competitor Anthropic and possibly Meta.

According to a Wall Street Journal report from last week, Huang is also concerned about the competition both Google and Anthropic pose to OpenAI's market dominance. Huang reportedly fears that if OpenAI falters, it could hurt Nvidia's sales, as the company is one of the chipmaker's biggest customers.

OpenAI reportedly declared a "code red" in December, just a few weeks after Google's latest release, Gemini 3, was seen as outperforming ChatGPT. Meanwhile, the company is also making significant efforts to scale Codex to beat the best-known coding agent, Anthropic's Claude Code.

If investor fears come true, the deal doesn't go as planned, and OpenAI can't meet its hefty financial commitments, the implications will go beyond OpenAI and Nvidia. That's because the two companies sit at the center of an intricate, tangled web of AI dealmaking, with several multibillion-dollar deals among these companies, including a $300 billion OpenAI-Oracle cloud deal that is even larger than Nvidia's commitment. These deals are a huge boon for the American economy, and if one deal goes down, it could take down all of them.


