By Shannon Carroll
Publication Date: 2026-01-06 18:00:00
SAN FRANCISCO — It’s a gorgeous California day in the Bay Area, one where the city is doing its most flattering impression of itself — cold sun, hard shadows, and that particular Friday energy where everyone’s late and somehow still stopping for a matcha. I’m in the passenger seat of a brand-new Mercedes-Benz CLA sedan zipping around the crowded, two-lane streets of the city when an everyday urban traffic trap unfolds: a delivery van ahead of us has stopped to unload, a bus is coming quickly in the other direction, and pedestrians are darting across the street as if crosswalks are a mere suggestion. The driver, Lucas, doesn’t have his hands on the wheel as the car glides forward. And I have exactly zero worries about how things will play out — because I’m inside Nvidia’s autonomous vehicle pilot, and this car makes better decisions than I can.
It has to. It’s trained to.
The car starts by using its camera and radar sensors to read the obvious tell: hazard lights. It doesn’t immediately commit to swinging around the van, though. It verifies that the vehicle is truly stopped, gauges the bus’s speed and distance, waits out the pedestrian flow, and then nudges left with the smoothness of someone who has done this a thousand times before — but without the ego of someone who has done this a thousand times before. (The best self-driving flex is “no drama.”)
“That was pretty well-handled,” Ali Kani, Nvidia’s VP of the automotive…