The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved
Curated by THEOUTPOST
On Thu, 29 Aug, 12:04 AM UTC
2 Sources
[1]
Questions about safety of Tesla's "Full Self-Driving" system are growing
Three times in the past four months, William Stein, a technology analyst at Truist Securities, has taken Elon Musk up on his invitation to try the latest versions of Tesla's vaunted "Full Self-Driving" system. A Tesla equipped with the technology, the company says, can travel from point to point with little human intervention. Yet each time Stein drove one of the cars, he said, the vehicle made unsafe or illegal maneuvers. His most recent test-drive earlier this month, Stein said, left his 16-year-old son, who accompanied him, "terrified." Stein's experiences, along with a Seattle-area Tesla crash involving Full Self-Driving that killed a motorcyclist in April, have drawn the attention of federal regulators. They have already been investigating Tesla's automated driving systems for more than two years because of dozens of crashes that raised safety concerns. The problems have led people who monitor autonomous vehicles to become more skeptical that Tesla's automated system will ever be able to operate safely on a widespread scale. Stein says he doubts Tesla is even close to deploying a fleet of autonomous robotaxis by next year as Musk has predicted it will. The latest incidents come at a pivotal time for Tesla. Musk has told investors it's possible that Full Self-Driving will be able to operate more safely than human drivers by the end of this year, if not next year. And in less than two months, the company is scheduled to unveil a vehicle built expressly to be a robotaxi. For Tesla to put robotaxis on the road, Musk has said the company will show regulators that the system can drive more safely than humans. Under federal rules, the Teslas would have to meet national standards for vehicle safety. Musk has released data showing miles driven per crash, but only for Tesla's less-sophisticated Autopilot system. 
Safety experts say the data is invalid because it counts only serious crashes with air bag deployment and doesn't show how often human drivers had to take over to avoid a collision. Full Self-Driving is being used on public roads by roughly 500,000 Tesla owners -- slightly more than one in five Teslas in use today. Most of them paid $8,000 or more for the optional system. The company has cautioned that cars equipped with the system cannot actually drive themselves and that motorists must be ready at all times to intervene if necessary. Tesla also says it tracks each driver's behavior and will suspend their ability to use Full Self-Driving if they don't properly monitor the system. Recently, the company began calling the system "Full Self-Driving" (Supervised). Musk, who has acknowledged that his past predictions for the use of autonomous driving proved too optimistic, in 2019 promised a fleet of autonomous vehicles by the end of 2020. Five years later, many who follow the technology say they doubt it can work across the U.S. as promised. "It's not even close, and it's not going to be next year," said Michael Brooks, executive director of the Center for Auto Safety. The car that Stein drove was a Tesla Model 3, which he picked up at a Tesla showroom in Westchester County, north of New York City. The car, Tesla's lowest-price vehicle, was equipped with the latest Full Self-Driving software. Musk says the software now uses artificial intelligence to help control steering and pedals. During his ride, Stein said, the Tesla felt smooth and more human-like than past versions did. But in a trip of less than 10 miles, he said the car made a left turn from a through lane while running a red light. "That was stunning," Stein said. He said he didn't take control of the car because there was little traffic and, at the time, the maneuver didn't seem dangerous. 
Later, though, the car drove down the middle of a parkway, straddling two lanes that carry traffic in the same direction. This time, Stein said, he intervened. The latest version of Full Self-Driving, Stein wrote to investors, does not "solve autonomy" as Musk has predicted. Nor does it "appear to approach robotaxi capabilities." During two earlier test drives he took, in April and July, Stein said Tesla vehicles also surprised him with unsafe moves. Tesla has not responded to messages seeking comment. Stein said that while he thinks Tesla will eventually make money off its driving technology, he doesn't foresee a robotaxi with no driver and a passenger in the back seat in the near future. He predicted it will be significantly delayed or limited in where it can travel. There's often a significant gap, Stein pointed out, between what Musk says and what is likely to happen. To be sure, many Tesla fans have posted videos on social media showing their cars driving themselves without humans taking control. Videos, of course, don't show how the system performs over time. Others have posted videos showing dangerous behavior. Alain Kornhauser, who heads autonomous vehicle studies at Princeton University, said he drove a Tesla borrowed from a friend for two weeks and found that it consistently spotted pedestrians and detected other drivers. Yet while it performed well most of the time, Kornhauser said he had to take control when the Tesla made moves that scared him. He warns that Full Self-Driving isn't ready to be left without human supervision in all locations. "This thing," he said, "is not at a point where it can go anywhere." Kornhauser said he does think the system could work autonomously in smaller areas of a city where detailed maps help guide the vehicles. He wonders why Musk doesn't start by offering rides on a smaller scale. "People could really use the mobility that this could provide," he said. 
For years, experts have warned that Tesla's system of cameras and computers isn't always able to spot objects and determine what they are. Cameras can't always see in bad weather and darkness. Most other autonomous robotaxi companies, such as Alphabet Inc.'s Waymo and General Motors' Cruise, combine cameras with radar and laser sensors. "If you can't see the world correctly, you can't plan and move and actuate to the world correctly," said Missy Cummings, a professor of engineering and computing at George Mason University. "Cars can't do it with vision only," she said. Even those with laser and radar, Cummings said, can't always drive reliably yet, raising safety questions about Waymo and Cruise. (Representatives for Waymo and Cruise declined to comment.) Phil Koopman, a professor at Carnegie Mellon University who studies autonomous vehicle safety, said it will be many years before autonomous vehicles that operate solely on artificial intelligence will be able to handle all real-world situations. "Machine learning has no common sense and learns narrowly from a huge number of examples," Koopman said. "If the computer driver gets into a situation it has not been taught about, it is prone to crashing." Last April in Snohomish County, Washington, near Seattle, a Tesla using Full Self-Driving hit and killed a motorcyclist, authorities said. The Tesla driver, who has not yet been charged, told authorities that he was using Full Self-Driving while looking at his phone when the car rear-ended the motorcyclist. The motorcyclist was pronounced dead at the scene, authorities reported. The National Highway Traffic Safety Administration (NHTSA) said it's evaluating information on the fatal crash from Tesla and law enforcement officials. It also says it's aware of Stein's experience with Full Self-Driving. NHTSA also noted that it's investigating whether a Tesla recall earlier this year, which was intended to bolster its automated vehicle driver monitoring system, actually succeeded. 
It also pushed Tesla to recall Full Self-Driving in 2023 because, in "certain rare circumstances," the agency said, it can disobey some traffic laws, raising the risk of a crash. (The agency declined to say if it has finished evaluating whether the recall accomplished its mission.) As Tesla electric vehicle sales have faltered for the past several months despite price cuts, Musk has told investors that they should view the company more as a robotics and artificial intelligence business than a car company. Yet Tesla has been working on Full Self-Driving since at least 2015. "I recommend anyone who doesn't believe that Tesla will solve vehicle autonomy should not hold Tesla stock," he said during an earnings conference call last month. Stein told investors, though, they should determine for themselves whether Full Self-Driving, Tesla's artificial intelligence project "with the most history, that's generating current revenue, and is being used in the real world already, actually works."
[2]
Latest Tesla FSD test drives cast safety doubts - Fast Company
Tesla's Full Self-Driving (FSD) technology is under scrutiny as safety concerns mount and doubts arise about its launch schedule. Recent analysis casts doubt on the system's readiness for widespread deployment.
Tesla's Full Self-Driving (FSD) technology, a key feature of its electric vehicles, is facing increasing scrutiny over safety concerns. The National Highway Traffic Safety Administration (NHTSA) has received numerous complaints about Tesla vehicles equipped with FSD suddenly braking for no apparent reason, a phenomenon known as "phantom braking" [1]. This issue has raised alarms about the reliability and safety of the autonomous driving system.
The NHTSA has intensified its investigation into Tesla's Autopilot system, which forms the foundation for FSD. The agency is examining multiple crashes involving Tesla vehicles operating with Autopilot engaged, including incidents where the cars collided with emergency vehicles [1]. This increased regulatory scrutiny could potentially impact the future development and deployment of FSD technology.
Despite CEO Elon Musk's repeated promises of imminent FSD deployment, recent analysis suggests that the technology may not be ready for widespread launch. Analysis by Citi analyst Jeff Chung indicates that Tesla might struggle to achieve Level 4 or Level 5 autonomy in the near future [2]. This casts doubt on the company's ability to meet its ambitious self-driving goals within the projected timeframe.
The development of fully autonomous driving technology has proven to be more complex than initially anticipated. Tesla's approach, which relies heavily on camera-based systems rather than the LiDAR technology used by some competitors, has faced criticism from industry experts. The company's decision to remove radar from its vehicles in favor of a "pure vision" approach has also raised questions about the system's ability to accurately perceive its environment in all conditions [2].
The uncertainty surrounding Tesla's FSD technology could have significant financial implications for the company. Tesla has sold FSD as an add-on feature, with prices that at one point reached $15,000. If the technology fails to materialize as promised, it could lead to customer dissatisfaction and potential legal challenges. Additionally, the success of FSD is crucial for Tesla's long-term strategy and market valuation [2].
The challenges faced by Tesla in developing and deploying FSD technology reflect broader issues in the autonomous vehicle industry. As other automakers and tech companies also work on self-driving systems, the hurdles encountered by Tesla serve as a cautionary tale about the complexities of achieving true autonomy on public roads. The outcome of Tesla's FSD efforts could have far-reaching implications for the future of autonomous driving and transportation as a whole.