At the Washington, D.C., edition of its GTC conference, Nvidia today announced it is partnering with Uber to scale the world’s largest SAE Level 4-ready mobility network, using the ride-hailing company’s next-generation robotaxi and autonomous delivery fleets, the new Nvidia Drive AGX Hyperion 10 autonomous vehicle (AV) development platform, and Nvidia Drive AV software purpose-built for Level 4 autonomy.
“Robotaxis mark the beginning of a global transformation in mobility—making transportation safer, cleaner, and more efficient,” said Jensen Huang, Founder and CEO of Nvidia. “Together with Uber, we’re creating a framework for the entire industry to deploy autonomous fleets at scale, powered by Nvidia AI infrastructure.”
Uber is bringing together human drivers and AVs into a single operating network powered by Nvidia Drive AGX Hyperion-ready vehicles and the surrounding AI ecosystem, enabling the ride-hailing company to bridge today’s human-driven mobility with the autonomous fleets of tomorrow.
“Nvidia is the backbone of the AI era and is now fully harnessing that innovation to unleash L4 autonomy at enormous scale, while making it easier for Nvidia-empowered AVs to be deployed on Uber,” said Dara Khosrowshahi, CEO of Uber. “Autonomous mobility will transform our cities for the better, and we’re thrilled to partner with Nvidia to help make that vision a reality.”
Starting in 2027, Nvidia says it will support Uber in scaling its global autonomous fleet to 100,000 vehicles over time by enabling faster growth across the Level 4 ecosystem. These vehicles will be developed in collaboration with other Uber ecosystem partners using Nvidia Drive. Nvidia and Uber are also working together to develop a data factory, accelerated by the Nvidia Cosmos world foundation model development platform, to curate and process the data needed for AV development.
Drive AGX Hyperion 10
The Drive AGX Hyperion 10 reference production computer and sensor set architecture announced at GTC DC is a serious upgrade for Nvidia and its customers.
“This is our most advanced, production-ready, compute and sensor architecture,” Ali Kani, Vice President of Automotive at Nvidia, told media in a pre-event briefing. “Our last version of Hyperion wasn’t designed to be Level 4 end-to-end. Now we’re architecting a system that can truly drive you from any address to any address. If you want to build a robotaxi-ready fleet or a Level 4 passenger car that can drive anywhere, you can just align to our architecture and really accelerate development.”
The Hyperion 10 production platform features two redundant Drive AGX Thor systems-on-a-chip based on Blackwell that are safety-certified to ASIL-D. Adding to that is the safety-certified Nvidia DriveOS operating system; a fully qualified multimodal sensor suite comprising 14 high-definition cameras, nine radars, one lidar, and 12 ultrasonic sensors; and a qualified board design. By offering a prequalified sensor suite architecture, the company says the modular and customizable platform can accelerate development, lower costs, and give customers a running start with access to its rigorous development expertise and investments in automotive engineering and safety.
The two Drive AGX Thor SoCs each deliver more than 2,000 FP4 teraflops (1,000 INT8 TOPS) of real-time compute. Thor fuses diverse 360-degree sensor inputs and is optimized for transformer, vision-language-action (VLA), and generative AI workloads. It enables safe, Level 4 autonomous driving backed by industry-leading safety certifications and cybersecurity standards.
Nvidia’s autonomous driving approach taps into foundation AI, large language, and generative AI models trained on trillions of real and synthetic driving miles. The advanced models are said to allow self-driving systems to solve highly complex urban driving situations with humanlike reasoning and adaptability.
“At the core of Drive is Cosmos, our world foundation model and data processing platform that we’ve open-sourced and recently updated to enable developers to better curate, search, generate, and scale the data that they need for testing and validation,” explained Kani. “In addition to generating data, we can also bring sensor data directly into simulation using Nvidia Omniverse NuRec neural rendering. Both NuRec rendering and Cosmos transfer, which add new weather, lighting, and terrain conditions to existing scenes, are now available on the open-source Carla AV simulator. We’re also introducing LidarGen, a generative model on Cosmos that creates lifelike lidar point clouds to let developers test and validate vehicle perception.”
New reasoning VLA models combine visual understanding, natural language reasoning, and action generation to enable human-level understanding in AVs. By running reasoning VLAs in the vehicle, the AV can interpret nuanced and unpredictable real-world conditions—such as sudden changes in traffic flow, unstructured intersections, and unpredictable human behavior—in real time. AV toolchain leader Foretellix is integrating its Foretify Physical AI toolchain with Nvidia Drive for testing and validating these models.
To enable the industry to develop and evaluate these large models for autonomous driving, Nvidia is releasing the world’s largest multimodal AV dataset. Comprising 1,700 hours of real-world camera, radar, and lidar data from 25 countries, the dataset is designed to bolster development, post-training, and validation of foundation models.
Part of the platform is Nvidia’s Halos system to deliver safety guardrails from cloud to car, establishing a holistic framework to enable safe, scalable autonomous mobility. The company’s Halos AI Systems Inspection Lab, dedicated to AI safety and cybersecurity across automotive and robotics applications, performs independent evaluations and oversees the new Halos Certified Program, helping ensure products and systems meet rigorous criteria for trusted physical AI deployments.
Companies such as Aumovio, Bosch, Nuro, and Wayve are among the inaugural members of the Halos AI Systems Inspection Lab, the industry’s first to be accredited by the ANSI National Accreditation Board (ANAB). The lab aims to accelerate the safe, large-scale deployment of Level 4 automated driving.
Lucid and Stellantis onboard
At GTC DC, Nvidia announced full-stack wins with two OEMs—Lucid and Stellantis—using the Hyperion architecture. Kani explained that “full stack” means “the entire software stack, including the active safety, parking, and driving, is developed, trained, and tested by Nvidia using our three-computer platform.
“Lucid is building its next-generation vehicles on Nvidia Drive AV and Hyperion 10, bringing Level 4 intelligence to luxury performance,” said Kani. “Stellantis will have a couple of car lines, an SUV and a van, that will adopt Hyperion and be Level 4 ready. And the Mercedes-Benz S-Class will roll in to be Hyperion-ready in the 2028 timeframe. These cars are the first Level 4 passenger cars announced to be built in the industry, so we’re super proud of these software-defined vehicles.”
Lucid plans to deliver one of the world’s first consumer-owned Level 4 autonomous vehicles by integrating Drive AGX Thor into its future midsize vehicles, enabling true “eyes-off, hands-off, mind-off” capabilities. The company’s ADAS and autonomous roadmap, aided by Nvidia Drive AV, begins with an eyes-on, point-to-point driving L2++ system for Gravity and the company’s upcoming midsize vehicles. The automaker is leveraging Nvidia’s industrial platform and Omniverse to optimize manufacturing, reduce costs, and accelerate delivery through intelligent robotics and digital twin technology.
In July, Lucid, Nuro, and Uber announced a premium global robotaxi program created for the Uber ride-hailing platform.
Stellantis is advancing its global robotaxi strategy with a new collaboration among Nvidia, Uber, and Foxconn designed to pave the way for expanding Stellantis’ growing ecosystem for Level 4 autonomous mobility worldwide. Combining vehicle engineering, AI computing, ride-hailing operations, and electronics into a scalable solution, Stellantis aims to build on its AV-ready platforms to deliver safe, efficient, and affordable robotaxi services and add to its recently announced collaboration with Pony.ai to advance robotaxi development in Europe.
Other leading global automakers, robotaxi companies, and Tier 1 suppliers are already working with Nvidia and Uber to launch Level 4 fleets.
“The AV software for Uber’s fleet spans the globe and includes Nvidia Hyperion 10 partners Momenta, Nuro, Pony.ai, Waabi, Wayve, and WeRide, as well as our own Drive AV software,” said Kani.
In addition, Nvidia and Uber are supporting their shared partners, including Avride and May Mobility, across the worldwide ecosystem as they develop their software stacks on the Nvidia Drive Level 4 platform.
In trucking, Aurora and Volvo Autonomous Solutions are developing Level 4 autonomous trucks powered by the Nvidia Drive platform. Their next-generation systems, built on Drive AGX Thor, will accelerate Volvo’s upcoming Level 4 fleet.
- An Nvidia Hyperion-equipped Level 4 Uber vehicle.
- Lucid’s first Level 4 vehicles for consumers will be future midsize models.
- Stellantis is advancing its global robotaxi strategy with Nvidia and Uber.