11 Sources
[1]
I tested Nvidia's Tesla Full Self-Driving competitor -- Tesla should be worried
It's a beautiful, cloudless day in San Francisco, and I'm sitting in the passenger seat of a Mercedes-Benz CLA sedan. The driver, Lucas, has his hands on the steering wheel, but it's really just for show: the car is essentially driving itself.

The vehicle is using Nvidia's new point-to-point Level 2 (L2) driver-assist system, which is getting ready to roll out to more automakers in 2026. This is the chipmaker's big bet on driving automation, one it thinks can help grow its tiny automotive business into something more substantial and more profitable. Think of it as Nvidia's answer to Tesla's Full Self-Driving.

For roughly 40 minutes, we navigate a typically chaotic day in San Francisco, passing delivery trucks, cyclists, pedestrians, and even the occasional Waymo robotaxi. The Mercedes, under guidance from Nvidia's AI-powered system as well as its own built-in cameras and radar, handles itself confidently: traffic signals, four-way stops, double-parked cars, and even the occasional unprotected left. At one point, it makes a wide right turn to avoid a truck that's blocking an intersection, but not before allowing a few slow-moving pedestrians to cross in front.

Tesla fans would likely scoff at Nvidia's demonstration, arguing that Full Self-Driving is orders of magnitude more capable. Nvidia hasn't been working on this problem as long as Elon Musk's company, but what it showed me absolutely could go toe-to-toe with FSD under the most complex circumstances. And thanks to the redundancy provided by Mercedes' radar, some could argue it's safer and more robust than the camera-only FSD.

But perhaps a race between two companies is the wrong frame. After all, Tesla is one of Nvidia's biggest customers, using tens of thousands of the company's GPUs to train its AI models, representing billions of dollars in AI infrastructure. So even if Tesla wins, Nvidia, in a sense, wins too.

The invitation to test out Nvidia's new system came as a bit of a surprise.
After all, the company isn't exactly known as a self-driving leader. And while Nvidia has long supplied major automakers with chips and software for driver-assist systems, its automotive business is still relatively tiny compared to the billions it rakes in on AI. Its third-quarter revenue was $51.2 billion, but its automotive division brought in only $592 million, or about 1.2 percent of the total haul.

That could change soon, as Nvidia seeks to challenge Tesla and Waymo in the race to Level 4 (L4) autonomy -- cars that can fully drive themselves under specific conditions. Nvidia has invested billions of dollars over more than a decade to build a full-stack solution, says Xinzhou Wu, the head of the company's automotive division. That stack spans silicon, operating systems, and software. And Wu says that Nvidia is keeping safety at the forefront, claiming to be one of the few companies that meets high automotive safety requirements at both the silicon and the software levels.

At the hardware layer sits the company's Drive AGX system-on-a-chip (SoC), similar to Tesla's Full Self-Driving chip or Intel's Mobileye EyeQ. The SoC runs the safety-certified DriveOS operating system and is built on the Blackwell GPU architecture, which the company says can deliver 1,000 trillion operations per second (TOPS) of high-performance compute.

"Jensen always says, the mission for me and for my team is really to make everything that moves autonomous," Wu says. Wu outlines a roadmap in which Nvidia will release Level 2 highway and urban driving capabilities, including automated lane changes and stop sign and traffic signal recognition, in the first half of 2026. This includes an L2++ system, in which the vehicle will be able to navigate point-to-point autonomously under driver supervision. In the second half of the year, urban capabilities will expand to include autonomous parking.
And by the end of the year, Nvidia's L2++ system will encompass the entirety of the United States, Wu said. For L2 and L3 vehicles, Nvidia plans on using its Drive AGX Orin-based SoC. For fully autonomous L4 vehicles, the company will transition to the new Thor generation. Software redundancy becomes critical at this level, so the architecture will use two electronic control units (ECUs): a main ECU and a separate redundant ECU.

A "small scale" Level 4 trial, similar to Waymo's robotaxis, is also planned for 2026, followed by partner-based robotaxi deployments in 2027, Wu says. And by 2028, Nvidia predicts its self-driving tech will be in personally owned autonomous vehicles. Also in 2028, Nvidia plans on supplying systems that can enable Level 3 highway driving, in which drivers can take their hands off the wheel and eyes off the road under certain conditions. (Safety experts are highly skeptical about L3 systems.)

Ambitious stuff, to say the least. And some of it will obviously be dictated by Nvidia's automotive partners, including Mercedes, Jaguar Land Rover, and Lucid Motors, and whether or not they have the necessary confidence (and legal certainty) to include the tech in cars they sell to their customers. A bad crash, or even an ambiguous one where the tech could have been at fault, could jeopardize Nvidia's ambitions to become a Tier 1 supplier to the global auto industry.

Fortunately, there were no crashes, nor really any hiccups, during my experience with Nvidia's point-to-point system. To be sure, I wasn't in the driver's seat, so I didn't get to test it on my own terms. That will be up to the automakers, who get to decide when to release Nvidia's tech and in what models. Responsibility for when hands-free driving is allowed ultimately lies with the OEM, Ali Kani, VP and general manager of the automotive team at Nvidia, tells me.
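Nvidia hasn't published how the main and redundant ECUs it describes for L4 would arbitrate control, but the general shape of that kind of failover logic can be sketched. Everything below (the names, the heartbeat watchdog, the 100 ms threshold) is an illustrative assumption, not Nvidia's implementation:

```python
# Illustrative sketch of dual-ECU redundancy: a main ECU plans the
# trajectory while a redundant ECU runs an independent fallback.
# If the main unit stops responding or reports a fault, control
# switches to the backup; if neither is trustworthy, the vehicle
# performs a minimal-risk stop. All names/thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class EcuOutput:
    trajectory: list       # planned waypoints
    healthy: bool          # self-reported health flag
    heartbeat_age_ms: int  # time since last heartbeat

HEARTBEAT_TIMEOUT_MS = 100  # assumed watchdog threshold

def select_active_ecu(main: EcuOutput, backup: EcuOutput) -> str:
    """Return which ECU should control the vehicle."""
    if main.healthy and main.heartbeat_age_ms < HEARTBEAT_TIMEOUT_MS:
        return "main"
    if backup.healthy and backup.heartbeat_age_ms < HEARTBEAT_TIMEOUT_MS:
        return "backup"
    return "minimal-risk-stop"  # neither unit is trustworthy
```

The design point is that the backup is only useful if it is genuinely independent of the main unit, which is why the article describes it as a separate ECU rather than a second process on the same chip.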
The company designed its autonomous features to be highly customizable, allowing automakers to define parameters such as acceleration, deceleration, lane-change timing, and aggressiveness. This flexibility allows each OEM to express its own "driving personality," Kani said, making the system feel like a Mercedes, for example, rather than a generic autonomous driver. To that end, Mercedes is adopting something it calls "cooperative steering," which allows the driver to make steering adjustments without disengaging the L2 driver-assist system. This can be useful for avoiding potholes that the system does not consider obstacles. The driver can also tap the accelerator to start moving or slightly increase the speed, all without disengaging the system.

Kani emphasizes that Nvidia's not trying to solve driving for everyone. Instead, the goal is availability: those who want this partially automated system can have it, and those who don't can simply opt out. The system is based on reinforcement learning, meaning it can continue to improve over time as it gains more experience, Kani says. When asked how it compares to Tesla's Full Self-Driving, he says it's very close. In head-to-head city driving tests over long routes, the number of driver takeovers for Nvidia's system is comparable, sometimes favoring one system, sometimes the other.

What makes this particularly notable is how quickly progress has been made. Tesla took roughly eight years to enable urban driving with FSD, whereas Nvidia is expected to do the same within about a year. No other passenger-car system besides Tesla's has achieved this, Kani boasts. "We're coming fast," he says, as the Mercedes slows itself down at another intersection. "I'd say [we're] very close [to FSD]."
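Kani's description of OEM-tunable parameters suggests a per-brand configuration layer sitting on top of a shared driving stack. A minimal sketch of what such a "driving personality" could look like (parameter names, values, and the scaling rule are invented for illustration, not Nvidia's API):

```python
# Hypothetical per-OEM "driving personality" configuration, built from
# the parameters Kani mentions: acceleration, deceleration, lane-change
# timing, and overall aggressiveness. The same stack, tuned per brand.

from dataclasses import dataclass

@dataclass(frozen=True)
class DrivingPersonality:
    max_accel_mps2: float            # comfort cap on acceleration (m/s^2)
    max_decel_mps2: float            # comfort cap on braking (m/s^2)
    lane_change_lead_time_s: float   # how early to begin a lane change
    aggressiveness: float            # 0.0 (cautious) .. 1.0 (assertive)

    def scaled_accel(self) -> float:
        # A more aggressive profile uses more of its acceleration envelope.
        return self.max_accel_mps2 * (0.5 + 0.5 * self.aggressiveness)

# Two illustrative brand profiles running on the same underlying system.
comfort_sedan = DrivingPersonality(2.0, 3.0, 4.0, 0.2)
sporty_sedan = DrivingPersonality(3.5, 4.5, 2.5, 0.8)
```

The point of a layer like this is that the OEM, not Nvidia, decides how the car feels, which matches Kani's claim that responsibility for the driving experience ultimately lies with the automaker.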
[2]
Nvidia plans to test a robotaxi service in 2027 in self-driving push
Nvidia is building out an automotive tech business. Pictured here are its autonomous vehicle test cars at the company's auto garage in Santa Clara, California, on June 5, 2023.

Nvidia on Monday unveiled plans to test a robotaxi service with a partner as soon as 2027, highlighting the chipmaker's ambition to become a major player in the world of self-driving cars. The service would employ cars with "Level 4" driving, meaning they will be capable of driving without human intervention in predefined regions, Nvidia officials said at a self-driving demonstration in San Francisco last month. The company declined to name where it would operate or who its partner would be.

"We will probably start with a limited availability but work with the partner for us to get our footing," Xinzhou Wu, Nvidia's vice president of automotive, said at the event.

Since 2015, Nvidia has offered chips and other technology for cars under the brand name Drive, but that remains a small part of the company's business. Automotive and robotics chips accounted for just $592 million in sales in the quarter ended in October, or about 1% of Nvidia's total revenue. Nvidia announced a robotaxi partnership with Uber in October. The chipmaker said in December that it had developed software that can power a self-driving car, and that Mercedes-Benz models to be released in late 2026 will be able to use Nvidia's technology to navigate cities like San Francisco.

Self-driving cars remain one of the primary areas where Nvidia can show growth outside of AI infrastructure. CEO Jensen Huang has said that robotics -- including self-driving cars -- is the company's second most important growth category after artificial intelligence. "We imagine that someday, a billion cars on the road will all be autonomous," Huang said at a launch event on Monday at the CES conference in Las Vegas.
"You could either have it be a robotaxi that you're orchestrating and renting from somebody, or you could own it."

In addition to chips that go inside self-driving cars, Nvidia sells access to its famed AI chips as well as its simulation software to automotive companies so they can train self-driving models and develop technology. Nvidia says that car makers can use its Drive AGX Thor automotive computer, which costs about $3,500 per chip, to save on research and development costs and get self-driving features to market faster. Nvidia said it works with car makers to tune its technology, such as determining how hard the car should accelerate, for specific vehicles. "Some say, 'Hey, I need your help on training and optimizing my software on your chip, but I'll take care of simulation myself,'" said Ali Kani, general manager of Nvidia's automotive platform. Car companies like Mercedes-Benz want to tune Nvidia's technology, market it as part of their in-car experience, and sell it as part of or alongside a new car.

Robotaxis have caught on in the past year, led by Alphabet's Waymo, which is operating a commercial taxi service without drivers in five U.S. markets, including San Francisco. Nvidia's robotaxi announcement signals that it is targeting self-driving fleets in addition to personal cars that consumers could buy.
[3]
Nvidia unveils 'reasoning' AI technology for self-driving cars
The vehicle will be released in the US in the coming months before being rolled out in Europe and Asia. Wearing his trademark black leather jacket, Huang told an audience of hundreds that the project has taught Nvidia "an enormous amount" about how to help partners build robotic systems.

Analysts say the announcement reinforces Nvidia's leadership in integrating AI hardware and software, deepening its push into physical AI. "NVIDIA's pivot toward AI at scale and AI systems as differentiators will help keep it way ahead of rivals," said Paolo Pescatore, analyst at PP Foresight, from Las Vegas. "Alpamayo represents a profound shift for NVIDIA, moving from being primarily a compute provider to a platform provider for physical AI ecosystems."

Shares of the AI chip designer rose slightly in after-hours trading following Huang's presentation. It featured a video demonstration of the AI-powered Mercedes-Benz driving through San Francisco while the person seated behind the steering wheel kept their hands in their lap. "It drives so naturally because it learned directly from human demonstrators," Huang said, "but in every single scenario, when it comes up to the scenario, it reasons, tells you what it's going to do, and it reasons about what it's about to do."

Alpamayo is an open-source AI model, with the underlying code now available on machine learning platform Hugging Face, where autonomous vehicle researchers can access it for free and retrain the model, Huang said. "Our vision is that someday, every single car, every single truck, will be autonomous," he told the audience. The company also has plans to launch a robotaxi service by next year in collaboration with a partner, but has declined to name the partner or say where it will be.
[4]
Nvidia is putting its AI muscle behind autonomous vehicles
Nvidia is everywhere. Its logo has become the watermark of modern computing. It sits under AI-era compute, cloud infrastructure, PC graphics, and the industrial automation that's quietly remaking factories. So why wouldn't it want to take the wheel in the global autonomous vehicle push, too?

But Nvidia isn't really chasing "self-driving cars." That's not a big enough dream, and it's already late to the game. Nvidia is chasing autonomy as an industrial stack -- a set of chips, models, simulation tools, and safety systems that other companies can plug into, iterate on, and ship without betting their entire company on a robotaxi moonshot. That framing sounds semantic until you sit in one of Nvidia's demo vehicles and listen to its automotive leadership talk like they're building a supply chain, not a sci-fi future.

Now, Nvidia's first big "in cars" proof point has a pilot: The company's Drive AV software is launching in the all-new Mercedes-Benz CLA, with an enhanced Level 2 (or L2++) point-to-point driver-assistance system that Nvidia expects will hit U.S. roads by the end of the first quarter; European roads by the second; and Asian roads after that.

At a recent San Francisco ride-along tied to the Mercedes-Benz demo, Nvidia executives framed the company's automotive push as a full-stack effort: the in-car computer, the models, the validation loop, and the industrial-scale training infrastructure behind it. Xinzhou Wu, who leads Nvidia's automotive group, said the partnership has been more than four years in the making, a full-stack and multiyear proof point for that mass-market approach. Under the hood -- literally -- Wu described what Nvidia is running in the demo vehicles: a "hybrid stack" that pairs an end-to-end model with a more traditional, "classical" stack that's already deployed in Mercedes vehicles in Europe.
The coupling is the point: The end-to-end model drives a more humanlike feel; the classical stack provides an interpretable backstop, with a "safety monitor" choosing the safer trajectory. Physical AI, Wu said, is "a deep problem to solve for the next decade." That frame positions Nvidia as the one vendor already built to supply all three layers -- vehicle compute, data-center compute, simulation -- and it reframes the autonomy arms race as a compute and tooling problem more than a map-and-sensors problem. In that worldview, the winners are the companies that can compress the loop: train faster, simulate more edge cases, deploy more safely, and keep the system improving without breaking the trust contract with drivers. CEO Jensen Huang tells Wu that his job is "to make everything that moves autonomous." Here's a start.

In a separate briefing, Kani zoomed out to full Nvidia-prophetic mode. "Everything that moves will ultimately be fully autonomous," he said. In the same breath, he frames the era shift toward "physical AI," with autonomous vehicles as one of the three categories of "robots" that matter. The rhetoric is big -- maybe too big in an era where people still don't entirely trust this technology -- but it fits the underlying strategy: autonomy becomes another domain where Nvidia can supply... everything.

The near-term version, though, stays grounded in passenger-car reality: driver assistance, careful safety layering, and a partnership model designed to scale. The company isn't trying to wrestle control of the wheel; rather, it's building the system that makes more of the drive feel automatic -- and then building the compute loop that enables that capability to improve. The global AV push has plenty of contenders chasing the spotlight. Nvidia's move is quieter, more infrastructural, and therefore perhaps more dangerous in the way that platform plays tend to be.
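Wu's "hybrid stack" pairs an end-to-end model with a classical planner, and a safety monitor picks the safer trajectory. Nvidia hasn't detailed the selection rule; one plausible shape, using an invented clearance-to-obstacles risk metric, looks like this:

```python
# Sketch of a safety-monitor arbiter over two candidate trajectories:
# one from an end-to-end learned model (more humanlike), one from a
# classical planner (interpretable backstop). The risk metric here,
# minimum clearance to obstacles, is an assumption for illustration.

def min_clearance(trajectory, obstacles):
    """Smallest distance between any waypoint and any obstacle."""
    return min(
        abs(wx - ox) + abs(wy - oy)  # Manhattan distance keeps it simple
        for (wx, wy) in trajectory
        for (ox, oy) in obstacles
    )

def safety_monitor(e2e_traj, classical_traj, obstacles, margin=1.0):
    """Prefer the end-to-end trajectory unless it cuts clearance below
    the classical backstop's by more than `margin`."""
    e2e_clear = min_clearance(e2e_traj, obstacles)
    classical_clear = min_clearance(classical_traj, obstacles)
    if e2e_clear + margin >= classical_clear:
        return "end-to-end", e2e_traj
    return "classical", classical_traj
```

Whatever the real metric is, the architectural idea is the same: the learned model supplies the feel, and an interpretable rule decides when it is allowed to.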
Alpamayo is billed as a family of open-source models, simulation tools, and datasets aimed at "safe, reasoning-based" autonomous driving -- specifically the ugly, rare "long tail" cases that still break systems in ways that never show up in a glossy demo loop. Nvidia's pitch is that traditional autonomy architectures split perception and planning in ways that can limit scalability when something weird happens; end-to-end learning has moved the ball, but edge cases still demand models that can reason about cause and effect when the situation falls outside training experience. Alpamayo introduces "chain-of-thought, reasoning-based" vision-language-action (VLA) models designed to think through novel scenarios step by step, with improved explainability "critical to scaling trust and safety," according to the company, and the whole effort is "underpinned by the NVIDIA Halos safety system." CEO Jensen Huang is leaning all the way into the moment-language, calling it "the ChatGPT moment for physical AI," and arguing that Alpamayo can help vehicles "think through rare scenarios" and "explain their driving decisions," as "the foundation for safe, scalable autonomy." Nvidia says Alpamayo models are teacher models, "rather than running directly in-vehicle," meant to be fine-tuned and distilled into the runtime backbones of complete AV stacks. That makes Alpamayo feel less like "new feature coming to your Mercedes" and more like Nvidia trying to standardize the way autonomy gets built -- the same platform instinct, aimed at the training/simulation layer where Nvidia already knows how to win.
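Nvidia hasn't published an exact output schema for these reasoning-based models, but a model that emits both a trajectory and the logic behind it implies a structure roughly like the following. Every name and the hard-coded scenario are hypothetical stand-ins for illustration:

```python
# Hypothetical shape of a chain-of-thought driving decision: the model
# returns not just a trajectory but an ordered, human-readable reasoning
# trace, which is what would make the decision auditable after the fact.

from dataclasses import dataclass, field

@dataclass
class DrivingDecision:
    trajectory: list  # planned (x, y) waypoints
    reasoning: list = field(default_factory=list)  # ordered trace steps

def decide_at_blocked_lane(obstacle: str) -> DrivingDecision:
    """Toy stand-in for the model: hard-coded logic here, where a real
    system would generate both fields from video input."""
    d = DrivingDecision(trajectory=[(0, 0), (1, 0.5), (2, 0.5), (3, 0)])
    d.reasoning.append(f"Detected {obstacle} blocking the ego lane.")
    d.reasoning.append("Adjacent lane is clear; planning a nudge around.")
    d.reasoning.append("Returning to the original lane once past.")
    return d
```

The value of a trace like this is legibility: a disengagement report can quote the model's own stated rationale rather than reverse-engineering a black box, which is exactly the "scaling trust and safety" argument Nvidia is making.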
[5]
Nvidia unveils open-source AI for autonomous driving, ships in Mercedes-Benz CLA in Q1 2026
Nvidia (NVDA) held its CES 2026 keynote today, and as expected, Jensen Huang dropped a massive amount of news on the autonomous driving front. The biggest takeaway? Nvidia is moving beyond just "perceiving" the road to "reasoning" about it with a new family of open-source models called Alpamayo, which will power new autonomous and driver-assistance features, starting with Mercedes-Benz as soon as this quarter. Here's the breakdown of everything Nvidia announced for self-driving technology today.

Nvidia is calling this the "ChatGPT moment for physical AI." The company unveiled Alpamayo, a family of open-source AI models designed to solve the "long tail" problem of autonomous driving: those rare, weird edge cases that usually cause self-driving stacks to disengage or fail. The flagship is Alpamayo 1, a 10-billion-parameter Vision-Language-Action (VLA) model. Unlike traditional AV stacks that just detect objects and plan a path, Alpamayo uses "chain-of-thought" reasoning. It processes video input and generates a trajectory, but crucially, it also outputs the logic behind its decision. Jensen Huang explained that the model can "think through rare scenarios" and explain its driving decisions. To sweeten the deal for developers, Nvidia is going the open-source route, releasing the models along with simulation tools and datasets. This is a clear play to become the default "Android of Autonomy" while Tesla continues to keep its Full Self-Driving (FSD) stack completely closed.

We've been hearing about the Nvidia-Mercedes partnership for years, but today we got a concrete timeline. Huang confirmed that the 2025 Mercedes-Benz CLA will be the first production vehicle to ship with Nvidia's entire AV stack, including the new Alpamayo reasoning capabilities.
While it's officially launching as a "Level 2+" system -- much like Tesla's "Full Self-Driving," which in reality is a Level 2 driver-assistance system because it requires the driver's attention at all times -- the goal appears to be a push toward Level 4 capabilities. Here's how Mercedes describes the system right now:

With Mercedes-Benz's MB.DRIVE ASSIST PRO, driving assistance and navigation merge to create a completely new and safe driving experience. At the press of a button, the vehicle can help navigate through the city streets - from the parking lot to the destination - with advanced SAE-Level 2 assistance. Thanks to Mercedes-Benz's cooperative steering approach, steering adaptations are possible at any time without deactivating the system.

The sensor stack consists of 30 sensors, including 10 cameras, 5 radar sensors, and 12 ultrasonic sensors. Powering all the backend training and simulation is Nvidia's new Vera Rubin platform, the successor to Blackwell. It's a six-chip AI platform that Nvidia claims is now in full production. While much of this is data-center focused, the "Rubin" GPUs and "Vera" CPUs are what will likely be training the next iterations of Alpamayo that end up in your car.

This is a very interesting move from Nvidia. The fact that Alpamayo outputs a "reasoning trace" is huge for regulators who are terrified of black-box AI models crashing cars without us knowing why. The open-source aspect is also brilliant. By giving away the model and the simulator, Nvidia ensures that startups and other automakers get hooked on its CUDA ecosystem. If you can't build an autonomous system by yourself (which, let's be honest, most legacy automakers can't), you can now just grab Alpamayo and run it on Nvidia chips.
As for the Mercedes CLA, "Level 2+" reads like a plan to deliver something like what Tesla has with FSD, without the promise of unsupervised self-driving -- something Tesla has consistently failed to deliver despite selling it to its customers since 2016. If Mercedes actually ships a car in Q1 with capabilities similar to Tesla's FSD, and it's based on an open-sourced system that any automaker can buy, it could shake up the industry and start to commoditize this idea of "Level 2+" autonomous systems.
[6]
Jensen Huang Says Tesla FSD Is '100% State-Of-The-Art,' Explains How Nvidia's Alpamayo Is Different - NVIDIA (NASDAQ:NVDA)
Nvidia Corp (NASDAQ:NVDA) CEO Jensen Huang has hailed the progress made by Tesla Inc.'s (NASDAQ:TSLA) Full Self-Driving (FSD) system, saying that the Elon Musk-led EV giant's approach was "hard to criticize."

FSD Is State-Of-The-Art, Says Jensen Huang

Huang, in an interview with Bloomberg's Ed Ludlow on Tuesday, was asked about the differences between Nvidia's approach to self-driving and Tesla's FSD. Huang said that the key difference between Alpamayo and Tesla's FSD was the former's combination of vision, radar, and LiDAR tech. However, he then added that despite its combination of sensors and cameras, the technology was "rather similar" to Tesla's. "I think Elon's approach is about state of the art, as anybody knows, of autonomous driving robotics," Huang said. The Nvidia chief also said that he would "encourage" Tesla in its self-driving pursuits.

Musk Hails Tesla's AI4 Chip

The comments coincide with Musk hailing the automaker's AI4 chip, sharing that Tesla would have been spending double the planned $10 billion on Nvidia training hardware had it not been for its own in-house AI chipset. He also lamented the auto industry's lack of investment in AI and autonomous driving technology. Interestingly, Tesla rival Lucid Group Inc. (NASDAQ:LCID) recently showcased its robotaxi prototype.

Musk Wishes Success For Nvidia

Musk, following the reveal of Nvidia's self-driving technology, said that distribution could pose a challenge for the chipmaker in its autonomous goal, but ultimately wished the company success in its approach to self-driving technology. The billionaire also said that it could take some time for the technology to pose a competitive challenge to Tesla's FSD system, putting the timeframe at 5 or 6 years.
Nvidia's xAI Investment

On the other hand, Nvidia is a "strategic investor" in Musk's artificial intelligence enterprise, xAI, which announced it raised over $20 billion in Series E funding. Other investors in xAI include the likes of Cisco Investments (NASDAQ:CSCO), the Qatar Investment Authority, as well as Abu Dhabi's MGX.

Tesla scores well on the Momentum and Quality metrics, but offers poor Value. It also has a favorable price trend in the Short, Medium, and Long term.

Price Action: TSLA declined 4.14% to $432.96 at market close, but it gained 0.46% during the after-hours session, according to Benzinga Pro data.
[7]
Elon Musk Says Tesla's Planned $10B Spend On Nvidia Hardware Would Be Double If Not For This Reason, Laments Lack Of AI Investments In Auto Industry - Tesla (NASDAQ:TSLA)
Tesla Inc. (NASDAQ:TSLA) CEO Elon Musk stated on Monday that the EV maker would have had to spend twice as much on buying AI hardware from Nvidia Corp. (NASDAQ:NVDA) to train its systems if it weren't for the company's AI4 chipset.

Musk Touts $10 Billion Nvidia Spending

Musk also shared that Tesla was making close to two million cars annually and growing, all of which boast Tesla's "dual SoC AI4, 8 cameras, redundancy in steering actuation and other systems, high bandwidth communication, etc." "By the end of this year, Tesla will have spent ~$10B cumulatively just on Nvidia hardware for training," the billionaire said. Musk also added that while Nvidia was providing "helpful tools" to automakers, the industry as a whole is "doing very little" to develop self-driving technology.

Not A Competition For Tesla

On Tuesday, user @jamesdouma shared his thoughts on Nvidia's technology and whether it could pose a challenge to Tesla's Full Self-Driving (FSD) system, arguing that it couldn't and providing an in-depth analysis of the technology. "There is no scenario in which a company building on top of this new development kit will even slightly dent Tesla's Robotaxi market opportunity," the user said.

Musk On Nvidia's Alpamayo

Musk had earlier shared that the Alpamayo system could become competitive with Tesla's FSD in the next 5 or 6 years, though that number could stretch further, according to the billionaire. He also shared that the "actual time from when FSD sort of works to where it is much safer than a human is several years." The Tesla CEO previously stated that the technology could face challenges with distribution, a view seconded by Tesla's AI chief, Ashok Elluswamy. However, Musk ultimately wished success for Nvidia's self-driving efforts.
[8]
Nvidia Takes On Robotaxis With Chips, Software And Big Ambitions - NVIDIA (NASDAQ:NVDA)
Nvidia Corp. (NASDAQ:NVDA) is intensifying its push into autonomous driving, signaling ambitions to play a central role in the robotaxi market and directly challenge Tesla Inc.'s (NASDAQ:TSLA) long-term autonomy strategy.

Nvidia Targets Level 4 Robotaxis by 2027

At CES 2026 on Monday, Nvidia said it is working with robotaxi operators to deploy fleets powered by its AI chips and Drive AV software as early as 2027. Xinzhou Wu, Nvidia's vice president of automotive, said the company expects to enable "Level 4" robotaxis by that time: vehicles capable of operating without human intervention in defined areas. While Nvidia has offered automotive solutions since 2015, the segment remains a relatively small part of its business, CNBC reported. Still, CEO Jensen Huang has identified robotics and self-driving technology as Nvidia's next major growth engine after AI infrastructure.

'ChatGPT Moment' for Physical AI

To accelerate adoption, Nvidia is introducing new hardware and software aimed at simplifying autonomous system development. Its Drive AGX Thor automotive computer, priced at roughly $3,500 per chip, is designed to lower research costs and shorten development timelines for automakers. The company also unveiled Alpamayo, an open-source family of Vision-Language-Action models that integrate perception, reasoning, and action into a single system. Mobility leaders and industry experts, including Lucid Group, Inc. (NASDAQ:LCID), JLR, Uber Technologies, Inc. (NYSE:UBER), and Berkeley DeepDrive, are showing interest in Alpamayo to develop reasoning-based AV stacks that will enable Level 4 autonomy. Huang said the industry has reached what he described as the "ChatGPT moment for physical AI," marking a turning point where machines can understand, reason, and act in real-world environments.
He noted that robotaxis are among the earliest beneficiaries of this shift, as Nvidia's Alpamayo platform enables autonomous vehicles to reason through rare edge cases, operate safely in complex settings, and explain their driving decisions, capabilities he framed as essential to building safe, scalable autonomy.

Pressure Builds on Tesla

Nvidia's robotaxi strategy also targets fleet-based competitors such as Alphabet Inc.'s (NASDAQ:GOOGL) Waymo. The company recently demonstrated its system in a 2026 Mercedes-Benz CLA driving through San Francisco, currently classified by Nvidia as "Level 2 Plus Plus," comparable to Tesla's Full Self-Driving. Nvidia aims to reach point-to-point self-driving for consumer vehicles by 2028. Meanwhile, Tesla faces mounting scrutiny. Ross Gerber of Gerber Kawasaki said Tesla will not achieve Level 4 or Level 5 autonomy without addressing hardware limitations, not just software. Tesla has also revised its Full Self-Driving disclosures, clarifying that the system does not provide autonomous capability and requires constant human supervision.

NVDA Price Action: Nvidia shares were up 0.49% at $189.04 during premarket trading on Tuesday, according to Benzinga Pro data.
[9]
Elon Musk Says Nvidia's Alpamayo Technology Could Be 'Competitive' With Tesla's FSD In Next '5 Or 6 Years' - Tesla (NASDAQ:TSLA)
EV giant Tesla Inc.'s (NASDAQ:TSLA) CEO Elon Musk shared that Nvidia Corp's (NASDAQ:NVDA) autonomous driving technology could become a competitor to Tesla's Full Self-Driving (FSD) technology in the future.

Tesla FSD Vs Alpamayo

Following the unveiling of Nvidia's Alpamayo technology at the Consumer Electronics Show (CES) 2026, influencer Teslaconomics compared the technology with Tesla's FSD. "This looks like what Tesla has already been doing for years..." the influencer said in a post on X, adding that Tesla was instead training on "massive real world data from actual customers and a live fleet." The user added that the neural network in Tesla's system "continuously learns to chase that final 99.99999X% of safety."

Competitive In 5 Or 6 Years, Says Elon Musk

Musk responded that he agreed with the poster's views. "The actual time from when FSD sort of works to where it is much safer than a human is several years," the CEO said. He added that legacy automakers were still lagging when it comes to applying cameras and AI computers to their vehicles. "So this is maybe a competitive pressure on Tesla in 5 or 6 years, but probably longer," Musk shared.

Musk On Nvidia's Self-Driving Technology, New Chip

The comments come as Nvidia also showcased its new self-driving technology. Musk said that the technology's distribution would be "super hard" for Nvidia to solve, comments echoed by Tesla's AI chief, Ashok Elluswamy. Musk also said that despite the impressive technology in the new Vera Rubin chipsets, it would take "another 9 months" before the technology becomes "operational at scale" and the software can function smoothly.

Price Action: TSLA declined 0.07% to $451.37 during the pre-market trading session, according to Benzinga Pro data.
[10]
Elon Musk Says Nvidia's Next-Gen Rubin Chips Won't Be Operational At Scale Soon As Tesla Works On In-House AI Hardware - Tesla (NASDAQ:TSLA)
Tesla Inc. (NASDAQ:TSLA) CEO Elon Musk says that Nvidia Corp's (NASDAQ:NVDA) new chipset would take some time before it can be scaled. Nvidia's Rubin Chip On Monday, influencer Sawyer Merritt shared a video that detailed Nvidia's new Vera Rubin chips' capabilities, showcased at the Consumer Electronics Show (CES) 2026. The new architecture has no fans or cables and promises up to five times more powerful performance than the preceding Blackwell chip, and each rack would contain 72 Rubin chips. The chips, according to Nvidia CEO Jensen Huang, were two GPUs connected together. Nvidia Would Need 9 Months, Says Elon Musk The SpaceX CEO shared his thoughts in a response to Merritt's post. He shared that despite the impressive technology, it would take "another 9 months" before the technology becomes "operational at scale" and the software can function smoothly. Elon Musk On Nvidia's Self-Driving Exploits Nvidia also showcased its new self-driving technology, Alpamayo, at the event, which touted a vision-language-action approach and was hailed by Huang as the "ChatGPT moment" for self-driving. Musk shared that the technology's distribution would be "super hard" to solve for Nvidia. His comments were echoed by Tesla's AI Chief, Ashok Elluswamy, too. However, the Tesla CEO ultimately wished Nvidia the best and hoped that it succeeded in developing self-driving car tech. Tesla's In-House Chips Meanwhile, Tesla is also working on developing AI chips. Musk had said that the developments would enable Tesla to produce more chipsets than any of its competitors. He had also announced a major hiring drive at the company to help in its chip-building efforts.
Price Action: TSLA declined 0.14% to $451.05 during the after-hours trading session, according to Benzinga Pro data.
[11]
Elon Musk Says It Is 'Super Hard' To Solve Distribution As Nvidia Announces Competitor To Tesla FSD, But Adds 'Honestly Hope They Succeed' - Tesla (NASDAQ:TSLA)
Tesla Inc. (NASDAQ:TSLA) CEO Elon Musk says that the hardest part of self-driving technology is solving the long tail of the distribution, as Nvidia Corp (NASDAQ:NVDA) unveiled autonomous driving technology at the Consumer Electronics Show (CES) 2026. Distribution Is 'Super Hard' To Solve, Says Elon Musk Responding to a post on the social media platform X on Monday, Musk shared his thoughts on Nvidia's unveiling. "What they will find is that it's easy to get to 99% and then super hard to solve the long tail of the distribution," Musk said in the post. Still, the Tesla CEO wished Nvidia success in its self-driving efforts. "I honestly hope they succeed," Musk shared. Ashok Elluswamy Echoes Musk Musk's view was echoed by Tesla's AI chief, Ashok Elluswamy, who also outlined the challenge posed by the long tail. "The long tail is sooo long, that most people can't grasp it," Elluswamy said in his post. He had earlier touted Tesla's chip-building efforts, outlining the EV giant's synergistic approach to building AI hardware. Nvidia's Self-Driving Technology The comments come as Nvidia CEO Jensen Huang unveiled the chipmaker's Alpamayo model, which relies on a Vision-Language-Action (VLA) model with human-like reasoning capabilities. "The ChatGPT moment for physical AI is here -- when machines begin to understand, reason and act in the real world," Huang said, adding that robotaxis would benefit from the technology. Nvidia also announced it would be partnering with German automaker Mercedes-Benz to offer a Level 2 driver-assistance system based on the chipmaker's full-stack AV software. This comes after the chipmaker had partnered with European automakers last year. Tesla's Robotaxi On the other hand, Tesla has steadily recorded growth in its Robotaxi service, though it failed to reach Musk's goal of having driverless operations in Austin by the end of last year.
However, the CEO had shared that a Tesla Model Y robotaxi drove him around autonomously towards the end of last year. Multiple Tesla Cybercab prototypes have also been spotted testing across Austin and California. Musk had shared that the production of the vehicle would be scaled up in 2026. Price Action: TSLA declined 0.14% to $451.05 during the after-hours trading session, according to Benzinga Pro data.
Nvidia launched Alpamayo, a family of open-source AI models designed to solve the rare edge cases that plague self-driving cars. The technology debuts in the Mercedes-Benz CLA in Q1 2026 as a Level 2+ driver-assist system; CEO Jensen Huang calls it the "ChatGPT moment for physical AI." The chipmaker also plans to test a robotaxi service with a partner by 2027, marking an aggressive push into autonomous vehicles despite automotive revenue representing just 1.2 percent of its total business.
Nvidia has unveiled Alpamayo, a family of open-source AI models that represents the chipmaker's most aggressive push yet into autonomous driving. CEO Jensen Huang called it the "ChatGPT moment for physical AI," positioning the technology as a direct competitor to Tesla Full Self-Driving [3]. The flagship Alpamayo 1 is a 10-billion-parameter Vision-Language-Action (VLA) model that uses chain-of-thought reasoning to handle the rare edge cases that typically cause self-driving systems to fail [5]. Unlike traditional autonomous vehicle stacks that simply detect objects and plan routes, Alpamayo processes video input, generates trajectories, and, crucially, outputs the reasoning trace behind each decision [4].
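As a rough illustration of that interface (video in, trajectory plus reasoning trace out), here is a minimal Python sketch. The type names and the `drive_step` stub are hypothetical, invented for this example, not Nvidia's actual API:

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    # (x, y) waypoints in meters, ego-relative, one per future timestep
    waypoints: list[tuple[float, float]]

@dataclass
class DrivingDecision:
    trajectory: Trajectory
    # natural-language chain-of-thought explaining the chosen maneuver
    reasoning_trace: list[str]

def drive_step(video_frames: list[bytes]) -> DrivingDecision:
    """Hypothetical stand-in for one VLA inference step."""
    # A real model would consume the frames; this stub returns a canned
    # "yield to pedestrians, then turn right" decision for illustration.
    return DrivingDecision(
        trajectory=Trajectory(waypoints=[(0.0, 0.0), (1.5, 0.2), (3.0, 1.0)]),
        reasoning_trace=[
            "A truck blocks the intersection ahead.",
            "Pedestrians are crossing; yield until clear.",
            "Take a wide right turn around the truck.",
        ],
    )

decision = drive_step(video_frames=[])
print(len(decision.trajectory.waypoints))  # 3
print(decision.reasoning_trace[0])
```

The point of the shape is that the trajectory and its justification travel together, which is what distinguishes the reasoning-trace approach from a stack that only emits a path.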
The open-source approach marks a sharp contrast to Tesla's closed ecosystem. Nvidia released the models, simulation tools, and datasets on the machine-learning platform Hugging Face, allowing autonomous vehicle researchers to access and retrain the technology for free [3]. This strategy positions Nvidia as the "Android of Autonomy," while Tesla keeps its Full Self-Driving stack proprietary [5].

The 2025 Mercedes-Benz CLA will be the first production vehicle to ship with Nvidia's complete autonomous driving stack, including Alpamayo reasoning capabilities, launching in the United States by the end of Q1 2026 [5]. European deployment follows in Q2, with Asian markets coming later [4]. The vehicle features a Level 2+ point-to-point driver-assist system that operates under driver supervision, equipped with 30 sensors, including 10 cameras, 5 radar sensors, and 12 ultrasonic sensors [5].
Xinzhou Wu, who leads Nvidia's automotive division, explained that the partnership has been over four years in development and runs a "hybrid stack" that pairs an end-to-end model with a classical stack already deployed in Mercedes vehicles in Europe [4]. The end-to-end model delivers more humanlike driving behavior, while the classical stack provides an interpretable safety monitor that chooses the safer trajectory [4].
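That arbitration step can be sketched in a few lines. This is an illustrative toy (2D waypoints and a clearance-based safety check), not Nvidia's actual safety monitor; the margin value and function names are invented:

```python
import math

def min_clearance(waypoints, obstacles):
    """Smallest Euclidean distance (meters) from any waypoint to any obstacle."""
    return min(math.dist(w, o) for w in waypoints for o in obstacles)

def arbitrate(e2e_traj, classical_traj, obstacles, safety_margin=1.5):
    """Prefer the humanlike end-to-end plan, but fall back to the classical
    plan whenever the end-to-end plan violates the clearance margin."""
    if min_clearance(e2e_traj, obstacles) >= safety_margin:
        return e2e_traj
    return classical_traj

# The end-to-end plan passes 0.5 m from an obstacle; the classical plan keeps ~1.8 m.
obstacles = [(2.0, 0.0)]
e2e = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.5)]
classical = [(0.0, 0.0), (1.0, 1.5), (2.0, 2.5)]
chosen = arbitrate(e2e, classical, obstacles)
print(chosen is classical)  # True: the monitor rejected the tighter end-to-end plan
```

The design intuition the article describes is exactly this asymmetry: the learned planner drives by default, and the interpretable stack only needs to answer the narrower question of whether a proposed plan is safe enough.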
During a San Francisco demonstration, a Mercedes-Benz CLA using Nvidia's system navigated chaotic city streets for 40 minutes, handling traffic signals, four-way stops, double-parked cars, and unprotected left turns [1]. The vehicle even executed a wide right turn to avoid a truck blocking an intersection while allowing pedestrians to cross [1]. Thanks to the redundancy provided by Mercedes' radar, some argue the system is safer and more robust than Tesla's camera-only approach [1].
Nvidia outlined an aggressive timeline for expanding its autonomous driving capabilities. The company plans to release Level 2 highway and urban driving features, including automated lane changes and traffic signal recognition, in the first half of 2026 [1]. Urban capabilities will expand to include autonomous parking in the second half of 2026, with L2++ coverage extending across the entire United States by year's end [1].
For Level 4 autonomy (self-driving cars capable of operating without human intervention in predefined regions), Nvidia will transition from its Drive AGX Orin-based system-on-chip to the new Thor generation [1]. The architecture uses two electronic control units for software redundancy: a main ECU and a separate redundant ECU [1].
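At its simplest, that kind of redundancy reduces to a watchdog that switches command sources when the main ECU goes quiet. The sketch below is purely illustrative; the deadline value and function name are assumptions, not Nvidia's design:

```python
HEARTBEAT_DEADLINE_S = 0.1  # hypothetical 100 ms watchdog window

def select_command(main_cmd, backup_cmd, last_main_heartbeat, now):
    """Use the main ECU's command unless its heartbeat is stale."""
    if now - last_main_heartbeat <= HEARTBEAT_DEADLINE_S:
        return main_cmd
    # Main ECU presumed failed: the redundant ECU takes over.
    return backup_cmd

print(select_command("steer +0.10", "hold lane", last_main_heartbeat=0.98, now=1.00))  # steer +0.10
print(select_command("steer +0.10", "hold lane", last_main_heartbeat=0.50, now=1.00))  # hold lane
```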
A "small scale" Level 4 trial is planned for 2026, followed by partner-based robotaxi service deployments in 2027, Wu confirmed [1][2]. The company declined to name the partner or specify where the service would operate, though Wu indicated it would start with limited availability [2]. By 2028, Nvidia predicts its self-driving technology will appear in personally owned autonomous vehicles [1].
Despite its dominance in AI chips, Nvidia's automotive business remains remarkably small. In the third quarter, automotive and robotics chips generated just $592 million in revenue, merely 1.2 percent of Nvidia's total $51.2 billion haul [1][2]. Jensen Huang has identified robotics, including self-driving cars, as the company's second most important growth category after artificial intelligence [2].

"Jensen always says, the mission for me and for my team is really to make everything that moves autonomous," Wu explained [1]. At CES, Huang declared, "We imagine that someday, a billion cars on the road will all be autonomous" [2].
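The quoted share checks out as simple arithmetic:

```python
# Automotive and robotics revenue as a share of Nvidia's Q3 total.
auto_revenue_musd = 592        # $592 million
total_revenue_musd = 51_200    # $51.2 billion
share = auto_revenue_musd / total_revenue_musd
print(f"{share:.1%}")  # 1.2%
```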
Nvidia positions physical AI as "a deep problem to solve for the next decade," framing itself as the only vendor built to supply all three critical layers: vehicle compute, data-center compute, and simulation [4]. The company's Drive AGX Thor automotive computer costs about $3,500 per chip, and Nvidia argues that automakers can use it to reduce research and development costs while getting self-driving features to market faster [2].

The Drive AGX system-on-chip runs the safety-certified DriveOS operating system, built on the Blackwell GPU architecture capable of delivering 1,000 trillion operations per second (1,000 TOPS) [1]. Powering backend training and simulation is Nvidia's new Vera Rubin platform, a six-chip AI system now in full production [5].
The fact that Alpamayo outputs a reasoning trace addresses a critical concern for regulators wary of black-box AI models making life-or-death decisions on public roads [5]. Huang emphasized that the model can "think through rare scenarios" and "explain its driving decisions," with improved explainability "critical to scaling trust and safety" [4].
Nvidia says it meets high automotive safety requirements at both the silicon and software levels, underpinned by the NVIDIA Halos safety system [4]. Wu claims Nvidia is one of the few companies achieving this dual safety certification [1].
The open-source strategy also serves a practical purpose: by giving away the model and simulator, Nvidia ensures that startups and automakers become dependent on its CUDA ecosystem [5]. If legacy automakers struggle to build autonomous systems independently (which most do), they can simply adopt Alpamayo and run it on Nvidia chips.

Analyst Paolo Pescatore of PP Foresight noted that "Alpamayo represents a profound shift for NVIDIA, moving from being primarily a compute to a platform provider for physical AI ecosystems" [3]. If Mercedes successfully ships a vehicle in Q1 with capabilities similar to Tesla's FSD, based on an open-sourced system any automaker can license, it could commoditize Level 2+ autonomous systems and reshape the competitive landscape [5]. Nvidia also maintains a partnership with Uber, announced in October [2], and its automotive partners include Jaguar Land Rover and Lucid Motors [1].
Summarized by Navi