Uber launches ‘AV Labs’ division to collect driving data for robotaxi partners


Uber has more than 20 autonomous vehicle partners, and they all want one thing: data. So the company says it will supply it through a new division called Uber AV Labs.

Despite the name, Uber has not returned to developing its own robotaxis, an effort it halted after one of its test vehicles killed a pedestrian in 2018. (Uber eventually sold the division in 2020 in a complex deal with Aurora.) Instead, it will send its own sensor-equipped cars into cities to collect data for partners such as Waymo, Waabi, Lucid Motors, and others – although no contracts have been signed yet.

Generally speaking, self-driving cars are in the middle of a transition away from rules-based operation and toward relying more on reinforcement learning. When that happens, real-world driving data becomes increasingly valuable for training these systems.

Uber told TechCrunch that the autonomous car companies that want this data are the ones that already collect a lot of it themselves. This is a sign that, like most frontier AI labs, they know that “solving” the worst edge cases is a volume game.

A physical limitation

Currently, the size of an autonomous vehicle company’s fleet creates a physical limit on how much data it can collect. And while many of these companies build simulations of real-world environments to hedge against edge cases, nothing beats driving on actual roads – and driving a lot – when it comes to discovering all the strange, difficult, and flat-out unexpected scenarios that cars run into.

Waymo provides an example of this gap. The company has had autonomous vehicles operating or in testing for a decade, and yet its robotaxis were only recently caught illegally passing stopped school buses.

Access to a larger pool of driving data will help robotaxi companies solve some of these problems before, or as, they arise, Uber’s chief technology officer Praveen Neppalli Naga told TechCrunch in an exclusive interview.


And Uber doesn’t charge for it. At least not yet.

“Our goal, basically, is to democratize this data, right? I mean, the value of this data and the development of AV technology with partners is much greater than the money we can make from it,” he said.

Uber’s VP of engineering Danny Guo said the lab must first establish a basic data foundation before it can determine product-market fit. “Because if we don’t do it, we don’t believe anyone can,” Guo said. “So as someone who can open up the whole industry and facilitate the whole ecosystem, we believe we have to take on this responsibility now.”

Screws and sensors

The new AV Labs division is starting small. So far, it has only one car (a Hyundai Ioniq 5, though Uber says it’s not married to a single model), and Guo told TechCrunch that his team is literally screwing sensors – lidar, radar, and cameras – onto the vehicle.

“Right now we’re just making sure the sensor kit doesn’t fall off – that’s where we are,” he said with a laugh. “I think it will take some time before we can, say, deploy 100 cars on the road to start collecting data. But the prototype is there.”

Partners won’t receive raw data. Once the Uber AV Labs fleet is up and running, Naga said the division will “massage and work with the data to help partners.” This “semantic understanding” layer is what the driving software of companies like Waymo can tap into to improve a robotaxi’s path planning.

However, Guo said there will likely be an intermediate step, in which Uber plugs a partner’s driving software into AV Labs’ cars to run in “shadow mode.” Anytime an Uber AV Labs driver does something different from what the autonomous driving software would have done in shadow mode, Uber will flag that to its partner.

This will not only help discover deficiencies in the driving software, but also help train the models to drive more like a human and less like a robot, Guo said.
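Uber hasn’t published details of how those comparisons are made, but the core of shadow mode – logging the human driver’s actions alongside the passive planner’s proposed actions and flagging meaningful divergences – can be sketched roughly as follows. All names and thresholds here are hypothetical, purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class Action:
    steering_deg: float  # steering angle the driver/planner chose
    speed_mps: float     # target speed in meters per second

def flag_divergences(human_log, shadow_log, steer_tol=5.0, speed_tol=2.0):
    """Return the timesteps where the shadow planner's proposed action
    differs meaningfully from what the human driver actually did.
    Tolerances are illustrative, not anything Uber has disclosed."""
    flags = []
    for t, (human, shadow) in enumerate(zip(human_log, shadow_log)):
        if (abs(human.steering_deg - shadow.steering_deg) > steer_tol
                or abs(human.speed_mps - shadow.speed_mps) > speed_tol):
            flags.append(t)
    return flags

# Example: the human slows and steers around an obstacle the shadow
# planner missed, so timesteps 1 and 2 get flagged for review.
human  = [Action(0.0, 13.0), Action(0.0, 6.0),  Action(-10.0, 6.0)]
shadow = [Action(0.0, 13.0), Action(0.0, 13.0), Action(0.0, 13.0)]
print(flag_divergences(human, shadow))  # → [1, 2]
```

Each flagged timestep is exactly the kind of disagreement the article describes Uber surfacing to its partners: either the software has a deficiency, or the human’s behavior is a training signal for more natural driving.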

The Tesla method

If this approach sounds familiar, it’s because it’s essentially what Tesla has been doing to train its own autonomous car software for the past decade. Uber’s approach lacks the same scale, however, because Tesla has millions of customer cars driving on roads around the world every day.

That doesn’t bother Uber. Guo said he hopes to do more targeted data collection based on the needs of autonomous car companies.

“We have 600 cities that we can pick and choose (from). If the partner tells us a particular city they are interested in, we can just deploy our (vehicles),” he said.

Naga said the company expects to grow this new division to several hundred people within a year, and that Uber wants to move quickly. And while he sees a future where Uber’s entire fleet of ride-hailing vehicles can be used to collect even more training data, he knows the new division has to start somewhere.

“From our conversations with our partners, they just say: ‘Give us anything that helps.’ Because the amount of data that Uber can collect is more than anything they can do with their own data collection,” Guo said.

