At Nvidia's annual GTC taking place this week in San Jose, CA, company Founder and CEO Jensen Huang delivered a keynote at the SAP Center outlining his company's latest full AI stack advances, from accelerated computing and AI factories to new developments in inference infrastructure, open models, agentic systems, and physical AI.
Among the major automotive announcements was news that adoption of its Drive Hyperion platform is growing, with automakers such as BYD, Geely, Isuzu, and Nissan, as well as leading mobility providers, signing on. The uptake reflects rapid momentum toward safe, scalable AV (autonomous vehicle) development.
The company says that standardizing on Drive Hyperion—supported by its Halos OS safety architecture—enables its partners to accelerate validation cycles and streamline global deployment strategies. By using a standardized reference architecture that integrates compute, sensors, networking, and safety systems, they can achieve faster fleet learning and more efficient global scaling.
"The autonomous vehicle revolution is here—the first multitrillion-dollar robotics industry," said Huang. "Everything that moves will eventually be autonomous. The Nvidia Hyperion platform and our Alpamayo open reasoning models give vehicles the ability to perceive their surroundings, reason through complex situations and act safely—making scalable, Level 4 autonomy possible."
Strengthening the AV ecosystem
For those less acquainted with the breadth of Nvidia’s ecosystem, the company builds the infrastructure for physical AI spanning three computers, as well as the open models, libraries, and applications built on top of those. DGX trains the AI model; Omniverse and Cosmos on RTX simulate, test, and validate it; and AGX runs it on the machines at the edge.
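To make that division of labor concrete, here is a minimal, purely illustrative sketch of the three-computer loop. Every class and method name below is hypothetical, not a real Nvidia API; the sketch only shows how a model artifact flows from training through validation to the edge.

```python
# Purely illustrative sketch of the three-computer physical-AI loop.
# DGXTrainer, OmniverseSimulator, and AGXRuntime are hypothetical
# stand-ins, not real Nvidia APIs.

class DGXTrainer:
    """Stands in for cloud training on DGX systems."""
    def train(self, dataset):
        print(f"Training driving model on {len(dataset)} clips...")
        return {"weights": "checkpoint-v1"}  # placeholder model artifact

class OmniverseSimulator:
    """Stands in for simulation and validation on Omniverse and Cosmos."""
    def validate(self, model):
        print(f"Replaying scenario suite against {model['weights']}...")
        return True  # in practice: pass/fail across thousands of scenarios

class AGXRuntime:
    """Stands in for in-vehicle deployment on AGX edge compute."""
    def deploy(self, model):
        print(f"Deploying {model['weights']} to the fleet.")

# The loop: train in the cloud, gate on simulation, then ship to the car.
model = DGXTrainer().train(["clip_001", "clip_002", "clip_003"])
if OmniverseSimulator().validate(model):
    AGXRuntime().deploy(model)
```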
In the lead-up to GTC, Ali Kani, Vice President and General Manager of Automotive at Nvidia, briefed media on the company's physical AI news for automotive and the growing strength of its AV ecosystem.
“Developers in every field of physical AI are building on our platform,” said Kani. “We’re super proud to have cultivated the industry’s largest AV ecosystem. Nvidia builds the world’s only full stack, safe AV platform that’s in production, at scale. Our Drive platform spans the architecture, safety systems, and AI infrastructure used to train and validate AVs. At its core is Hyperion, an L4-ready vehicle reference architecture with compute, sensors, and software that the AV ecosystem can build on.”
He says that, in the last few weeks, Waymo announced the expansion of its robotaxi network to a 10th city. Zoox announced plans to expand to Las Vegas and Los Angeles with Uber. For both of these partners, Nvidia participates in both the car and cloud computers, Kani said. Just a few days ago, Nissan also announced a deployment with Wayve in Japan on Uber's network, where Nvidia is involved in all three computers.
“The ChatGPT moment for physical AI has arrived in autonomous vehicles, as we’re seeing the first rollout of physical AI at scale in this segment,” said Kani.
Drive Hyperion for Level 4 vehicle programs
Nvidia says that Nissan, powered by Wayve software, is developing next-generation Level 4 AV programs built on its Drive Hyperion production-ready compute and sensor architecture.
Late last week, Wayve, Uber, and Nissan announced an MOU to collaborate on robotaxi development and to begin work toward deploying robotaxi services. The parties will start preparations for a pilot deployment in Tokyo by late 2026, introducing the Nissan Leaf powered by the Wayve AI Driver, available to riders through Uber.
This marks Uber's first autonomous vehicle partnership in Japan and the next milestone in Wayve and Uber's global robotaxi rollout, which includes planned services across more than ten cities worldwide, including London. Under this scheme, the goal is to integrate Wayve's end-to-end AI autonomous driving system into a Nissan base vehicle that can accommodate the Wayve AI Driver and connect to Uber's ride-hailing platform, matching robotaxis with riders seeking transportation.
Nvidia says that Uber is building one of the world's most expansive autonomous ride-hailing networks powered by Drive Hyperion. Supported by a growing roster of automaker platforms, the two companies today announced an expanded partnership to launch a fleet of AVs powered entirely by the full-stack Nvidia Drive AV software across 28 cities on four continents by 2028.
The rollout will begin with Los Angeles and the San Francisco Bay Area in the first half of 2027. This Drive Hyperion-powered fleet will tap into Nvidia’s Alpamayo open models and the Halos operating system to accelerate the development and deployment of safe, scalable robotaxi services worldwide.
Other mobility leaders including Bolt, Grab, and Lyft are also leveraging Nvidia Drive Hyperion to accelerate autonomous mobility initiatives, signaling broader industry momentum toward software-defined robotaxi fleets.
The recent news that Isuzu and Tier IV are collaborating on Level 4 autonomous bus development is underpinned by Nvidia’s Drive AGX Thor system-on-a-chip, part of Drive Hyperion.
Nvidia is also collaborating with Amazon to advance the Alexa Custom Assistant with multimodal edge AI capabilities on Drive AGX accelerated compute, enabling automakers to deliver ambient in-cabin intelligence with enhanced performance and privacy in mind.
Advancing safety with Halos
Safety is the mission of autonomous driving, according to Kani.
"At GTC, we're announcing Nvidia Halos OS, a unified software safety foundation for L4 autonomy on the Drive Hyperion platform," he said. "This is a complete cloud-to-car safety ecosystem. It bridges infrastructure (our massive cloud-based simulation and AV development pipeline) with a powerful three-layer in-car stack."
First is the Halos core, built on Nvidia's industry-leading Drive OS to ensure the highest functional safety integrity in cars. Second is the Halos SDK, which allows developers to write highly portable applications that scale across any vehicle. Third is the application layer, which includes the company's NCAP five-star active safety stack, ensuring that advanced reasoning models like Alpamayo always operate predictably and safely.
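The article doesn't detail how that application layer constrains a reasoning model, but the general guardrail pattern is well established in safety engineering. The sketch below is a generic illustration, not Halos code; the limits and function names are invented for the example.

```python
# Generic guardrail pattern (not Halos source code): an application-layer
# safety gate vets trajectories proposed by a reasoning model before
# they reach vehicle actuation. All limits here are hypothetical.

MAX_SPEED = 35.0         # m/s, invented hard limit
MAX_LATERAL_ACCEL = 3.0  # m/s^2, invented hard limit

def minimal_risk_maneuver():
    """Deterministic fallback: decelerate smoothly to a stop in lane."""
    return [{"speed": s, "lat_accel": 0.0} for s in (10.0, 5.0, 0.0)]

def safety_gate(trajectory):
    """Pass the trajectory through only if every point satisfies the
    hard constraints; otherwise substitute the fallback maneuver."""
    for point in trajectory:
        if point["speed"] > MAX_SPEED or abs(point["lat_accel"]) > MAX_LATERAL_ACCEL:
            return minimal_risk_maneuver()
    return trajectory

# The second waypoint exceeds MAX_SPEED, so the gate falls back.
proposed = [{"speed": 12.0, "lat_accel": 1.2}, {"speed": 40.0, "lat_accel": 0.5}]
executed = safety_gate(proposed)
```

The key property of this pattern is that the fallback path is simple and deterministic, so its behavior can be verified independently of the learned model it supervises.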
“By combining the cloud with the car, Halos is the framework for safely scaling autonomous vehicles,” claims Kani.
Halos' unified, three-layer safety architecture integrates safety middleware and deployable safety applications, including an NCAP five-star active safety stack, to provide the guardrails that enable reasoning-based AI systems to operate with verifiable, automotive-grade integrity at scale. One example: Nvidia Drive AV enabled the new Mercedes-Benz CLA to earn top Euro NCAP Award honors.
To continuously validate and support the rigorous AV safety ecosystem, AEye, Flex, Gatik, Hesai, Lucid, MIRA, PlusAI, Qt Group, Saphira, and Valeo are joining the Nvidia Halos AI Systems Inspection Lab.
Nvidia says its Halos certification is the industry’s first program dedicated to tuning and optimizing safety in physical AI deployments, including AVs and robotics. The program ensures that AI products have passed rigorous safety, cybersecurity, and AI safety inspections so they can be more easily integrated into the Halos technology stack.
Alpamayo updated to Version 1.5
At GTC, Nvidia is also launching and open sourcing a new version of Alpamayo, its portfolio of AI models, simulation frameworks, and datasets designed to enable Level 4 autonomous driving through reasoning-based, human-like judgment. Version 1.5 follows version 1.0, released at CES 2026, which has already logged over 150,000 downloads.
According to Kani, Alpamayo processes video, motion, and text prompts to generate driving trajectories, complete with clear reasoning traces. This latest release adds support for navigation inputs, as well as post-training scripts to help developers kick-start their workflows.
Version 1.5 takes driving video, ego-motion history, navigation guidance, and natural language prompts as inputs, then outputs driving trajectories with reasoning traces. This lets developers steer behavior and specify constraints directly through navigation and text prompts. In addition, the Alpamayo portfolio now includes post-training scripts that enable researchers and developers to adapt the model.
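In code terms, that interface might look something like the sketch below. The function and field names are invented for illustration and are not Alpamayo's published API; consult Nvidia's open-sourced release for the real one.

```python
# Hypothetical interface sketch for an Alpamayo-style model call.
# Names below are invented; see Nvidia's open-sourced Alpamayo
# release for the actual API.

def drive_step(model, camera_video, ego_history, navigation, prompt):
    """Feed the four input modalities and return a planned trajectory
    along with the reasoning trace that justifies it."""
    output = model.infer(
        video=camera_video,      # recent multi-camera frames
        ego_motion=ego_history,  # past poses and velocities
        navigation=navigation,   # routing guidance, e.g. "take the second exit"
        text_prompt=prompt,      # behavioral constraint in natural language
    )
    return output["trajectory"], output["reasoning_trace"]

# Example usage (hypothetical): steer behavior through a language prompt.
# trajectory, trace = drive_step(
#     model, frames, history,
#     navigation="continue straight for 300 m",
#     prompt="yield to the cyclist merging from the right",
# )
```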
With Alpamayo 1.5, vehicles can more effectively learn from rare or unpredictable events—such as unusual road hazards and complex human behavior—by replaying scenarios, querying model decisions and applying updated behavioral guidance through prompts and navigation settings.
The model also adds flexible multi-camera support and configurable camera parameters, simplifying reuse of the same AI driving stack across vehicle lines and sensor configurations while preserving compatibility with existing Alpamayo integrations.
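As a rough illustration of what configurable camera parameters buy you, consider the hypothetical rig descriptions below. The field names are invented and do not reflect Alpamayo's actual configuration schema.

```python
# Hypothetical camera-rig configurations showing how one driving stack
# can serve different vehicle lines by swapping a rig description.
# Field names are invented for illustration.

SEDAN_RIG = {
    "cameras": [
        {"name": "front_wide", "fov_deg": 120, "resolution": (1920, 1080)},
        {"name": "front_tele", "fov_deg": 30,  "resolution": (1920, 1080)},
        {"name": "rear",       "fov_deg": 100, "resolution": (1280, 720)},
    ],
}

# An SUV variant reuses the sedan rig and adds side coverage.
SUV_RIG = {
    "cameras": SEDAN_RIG["cameras"] + [
        {"name": "left_side",  "fov_deg": 100, "resolution": (1280, 720)},
        {"name": "right_side", "fov_deg": 100, "resolution": (1280, 720)},
    ],
}

# Same model weights, different rig (hypothetical loader call):
# model = load_driving_model(camera_rig=SUV_RIG)
```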
Since launching earlier this year, Alpamayo has been downloaded by more than 100,000 automotive developers worldwide.
Accelerated reasoning with NuRec
Nvidia says that testing and validating reasoning-based AVs requires high-fidelity simulation that covers the diversity of real-world driving.
At GTC, Nvidia also announced that its Omniverse NuRec is now open sourced and generally available. The set of 3D Gaussian Splatting technologies ingests real-world data to reconstruct and render interactive simulations, addressing that need.
"It allows developers to quickly stress-test models like Alpamayo, bypassing the time and cost of manual world building," said Kani.
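A closed-loop stress test on top of a neural reconstruction might look roughly like the following sketch. Every helper below is an invented stand-in, not the open-sourced NuRec interface.

```python
# Illustrative closed-loop test over a reconstructed scene, in the
# spirit of NuRec's Gaussian-splatting workflow. All APIs here are
# hypothetical stand-ins, not the actual NuRec interfaces.

def closed_loop_test(drive_log, driving_model, n_steps=100):
    """Reconstruct a logged drive into an interactive 3D scene, then
    let the model under test drive through it step by step."""
    scene = reconstruct_scene(drive_log)    # hypothetical: log -> Gaussian splats
    state = scene.initial_state()
    for _ in range(n_steps):
        frames = scene.render(state)        # render novel views from the splats
        action = driving_model.act(frames)  # model proposes the next control
        state = scene.step(state, action)   # advance the interactive simulation
        if scene.in_collision(state):
            return False                    # scenario failed
    return True                             # scenario passed
```

Because the scene is reconstructed from logged data rather than hand-built, the same pipeline can in principle turn every mile of fleet driving into a repeatable regression test.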
Leading AV toolchain providers such as 51WORLD, dSPACE, and Foretellix have integrated NuRec into their simulation solutions.
51WORLD says the product-level integration of NuRec into its 51Sim platform enables real-world fleet data to be reconstructed into interactive neural scenarios and used directly for closed-loop simulation of autonomous driving systems, giving automakers and autonomous driving algorithm companies a new data-driven simulation capability. Through this system, real-world road data collected by fleets is no longer just an offline data asset; it can be continuously transformed into a runnable simulation environment, closing the loop efficiently from real roads to simulation verification.
On the customer front, Voxel51 is using NuRec in its Physical AI Workbench for customers such as Porsche Research. Synthetic data generation engine startup Parallel Domain is using the NuRec Fixer model to enhance its reconstruction pipeline. Mcity, an AV research facility run by the University of Michigan, is using NuRec to build a Gaussian-based digital twin of its physical test track for the AV industry and research community.
- Last week, Wayve, Uber, and Nissan announced a robotaxi collaboration.
- Zoox robotaxis will be deployed with Uber in Las Vegas and Los Angeles.
- BYD, Geely, Isuzu, and Nissan adopt Nvidia Drive Hyperion for Level 4 vehicles.
- Nvidia expands global Drive Hyperion ecosystem to accelerate the road to full autonomy.
- Nvidia Drive AV on the Mercedes-Benz CLA helped it earn top Euro NCAP Award.