How safe are systems like Tesla’s Autopilot? Nobody knows.

Every three months, Tesla releases a safety report that gives the number of miles between accidents when drivers use the company’s driver-assistance system, Autopilot, and the number of miles between accidents when they don’t.

These numbers consistently show that accidents are less common with Autopilot, a collection of technologies that allow Tesla vehicles to steer, brake, and accelerate on their own.

But the numbers are misleading. Autopilot is used mainly for highway driving, which is generally about twice as safe as driving on city streets, according to the Department of Transportation. There may be fewer crashes with Autopilot simply because it is typically used in safer situations.
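To see why the aggregate comparison can mislead, consider a confounding sketch with entirely made-up numbers (nothing below comes from Tesla, NHTSA, or any study): crash rates are identical with and without a driver assist on each road type, yet the overall miles-per-crash figure flatters the assist because its miles are concentrated on safer highway driving.

```python
# Hypothetical illustration of how road mix can skew aggregate crash rates.
# All figures are invented for this sketch; none come from Tesla or NHTSA.

# (miles driven, crashes) per road type, with and without a driver assist.
with_assist = {"highway": (9_000_000, 9), "city": (1_000_000, 2)}
without_assist = {"highway": (2_000_000, 2), "city": (8_000_000, 16)}

def miles_per_crash(data):
    """Aggregate miles per crash across all road types."""
    miles = sum(m for m, _ in data.values())
    crashes = sum(c for _, c in data.values())
    return miles / crashes

# Aggregate comparison: the assisted fleet looks roughly 60 percent safer...
print(f"with assist:    {miles_per_crash(with_assist):,.0f} miles per crash")
print(f"without assist: {miles_per_crash(without_assist):,.0f} miles per crash")

# ...yet within each road type the two fleets crash at identical rates:
# 1,000,000 miles per crash on highways, 500,000 on city streets.
for road in ("highway", "city"):
    m1, c1 = with_assist[road]
    m2, c2 = without_assist[road]
    print(f"{road}: {m1 / c1:,.0f} vs {m2 / c2:,.0f} miles per crash")
```

In this toy example both fleets are exactly equally safe on every road type, yet the headline miles-per-crash number makes the assisted fleet look far safer. That is the kind of apples-to-oranges comparison the researchers quoted below warn about.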

Tesla has not provided data that would allow comparison of Autopilot safety on the same types of roads. Neither have other automakers offering similar systems.

Autopilot has been on public roads since 2015. General Motors introduced Super Cruise in 2017 and Ford Motor released BlueCruise last year. But publicly available data that reliably measures the safety of these technologies is scarce. American drivers – whether they use these systems or share the road with them – are effectively guinea pigs in an experiment whose results have not yet been announced.

Automakers and tech companies are adding more vehicle features that they claim improve safety, but those claims are hard to verify. Meanwhile, deaths on the country’s roads and highways have risen in recent years, hitting a 16-year high in 2021. Whatever additional safety these technological advances provide does not appear to be making up for bad decisions by drivers behind the wheel.

“There is a lack of data to give the public confidence that these systems, as deployed, will deliver their anticipated safety benefits,” said J. Christian Gerdes, a professor of mechanical engineering and co-director of the Center for Automotive Research at Stanford University, who was the Department of Transportation’s first chief innovation officer.

GM worked with the University of Michigan on a study examining the potential safety benefits of Super Cruise but concluded that there was not enough data to determine whether the system reduced accidents.

A year ago, the National Highway Traffic Safety Administration, the government’s auto safety agency, ordered companies to report potentially serious accidents involving advanced driver-assistance systems like Autopilot within a day of learning about them. The order said the agency would make the reports public, but it has not yet done so.

The safety agency declined to comment on what information it has collected so far but said in a statement that the data would be released “in the near future.”

Tesla and its CEO Elon Musk did not respond to requests for comment. GM said it reported two Super Cruise incidents to NHTSA: one in 2018 and one in 2020. Ford declined to comment.

The agency’s data likely won’t provide a full picture of the situation, but it could encourage lawmakers and drivers to take a much closer look at these technologies and ultimately change the way they’re marketed and regulated.

“To solve a problem, you first have to understand it,” said Bryant Walker Smith, an associate professor in the University of South Carolina’s schools of law and engineering who specializes in emerging transportation technologies. “This is a way of getting more ground truth as a basis for investigation, regulation and other action.”

Despite its capabilities, Autopilot does not relieve the driver of responsibility. Tesla tells drivers to remain vigilant and be ready to take control of the car at all times. The same applies to BlueCruise and Super Cruise.

Many experts worry, however, that because these systems allow drivers to relinquish active control of the car, they may lull drivers into believing their cars are driving themselves. Then, when the technology fails or cannot handle a situation on its own, drivers may be unprepared to take control as quickly as needed.

Legacy technologies such as automatic emergency braking and lane departure warning have long provided safety nets for drivers, slowing or stopping the car or alerting drivers when they deviate from their lane. But newer driver assistance systems reverse this arrangement, making the driver the safety net for the technology.

Safety experts are particularly concerned about Autopilot because of the way it has been marketed. Mr. Musk has said for years that the company’s cars are on the verge of true autonomy, able to drive themselves in virtually any situation. The system’s name also implies a degree of automation that the technology has yet to achieve.

This can breed driver complacency. Autopilot has played a role in many fatal accidents, in some cases because drivers were not prepared to take control of the car.

Mr. Musk has long promoted Autopilot as a way to improve safety, and Tesla’s quarterly safety reports appear to back him up. But a recent study by the Virginia Transportation Research Council, an arm of the Virginia Department of Transportation, shows that these reports are not what they seem.

“We know that cars with Autopilot are less likely to crash than those without,” said Noah Goodall, a researcher at the council who studies safety and operational issues surrounding autonomous vehicles. “But are they driven the same way, on the same roads, at the same time of day, and by the same drivers?”

Analyzing police and insurance data, the Insurance Institute for Highway Safety, a nonprofit research organization funded by the insurance industry, found that older technologies like automatic emergency braking and lane departure warning improved safety. But the organization said studies have yet to show that driver-assistance systems provide similar benefits.

Part of the problem is that police and insurance data don’t always indicate whether these systems were being used at the time of an accident.

The federal auto safety agency has ordered companies to provide data on crashes in which driver-assistance technologies were in use within 30 seconds of impact. This could provide a broader picture of how these systems perform.

But even with that data, safety experts say, it will be difficult to determine whether using these systems is safer than turning them off in the same situations.

The Alliance for Automotive Innovation, a trade group for car companies, has warned that the federal safety agency’s data could be misconstrued or misrepresented. Some independent experts express similar concerns.

“My big concern is that we’re going to have detailed data on accidents involving these technologies without comparable data on accidents involving conventional cars,” said Matthew Wansley, a professor at New York’s Cardozo School of Law who specializes in emerging automotive technologies and was previously general counsel of an autonomous vehicle start-up called nuTonomy. “It could potentially appear that these systems are much less safe than they really are.”

For this and other reasons, automakers may be reluctant to share some data with the agency. Under its order, companies can ask that certain data be withheld by claiming it would reveal trade secrets.

The agency is also collecting crash data on automated driving systems, the more advanced technologies that aim to remove drivers from cars entirely. These systems are often referred to as “self-driving cars.”

For the most part, this technology is still being tested in a relatively small number of cars, with drivers behind the wheel as a backup. Waymo, a unit of Google’s parent company, Alphabet, operates a driverless service in the Phoenix suburbs, and similar services are planned in cities like San Francisco and Miami.

Companies are already required to report accidents involving automated driving systems in some states. Here, too, the federal safety agency’s nationwide data should provide additional insight.

The more immediate concern, however, is the safety of Autopilot and other driver-assistance systems, which are installed in hundreds of thousands of vehicles.

“There is an open question: Does Autopilot increase accident rates or decrease them?” said Mr. Wansley. “We may not get a complete answer, but we will get some useful information.”


