Tesla is forcing us to ask whether cars should drive like humans or like robots

Many motorists already fail to recognize the limits of technology in road traffic. As driver assistance systems become more common and sophisticated, the public is confused about what “self-driving” means. In one opinion poll last year by the analyst firm JD Power, only 37 percent of respondents chose the correct definition of self-driving cars.

Neither Tesla nor any other company sells a self-driving or autonomous vehicle, that is, a vehicle capable of driving itself in a variety of locations and circumstances without a human ready to take over.

Despite this, Tesla markets its driver assistance systems in the US with names that regulators and safety experts say are misleading: Autopilot for the standard package and Full Self-Driving for the premium package.

At the same time, Tesla warns drivers in its manuals that it is their responsibility to use the features safely and that they must be prepared to take over the driving task at any moment, with their eyes on the road and hands on the wheel.

The difficulty of navigating an unpredictable environment is one of the reasons there aren’t truly self-driving cars yet.

“An autonomous vehicle has to be better and more agile than the driver it replaces, not worse,” said William S. Lerner, transportation safety expert and delegate of the International Organization for Standardization, a group that sets global industry standards.

“I wish we were there already, but we’re not, apart from straight freeways with typical on and off ramps that have been mapped,” he said.

“Caught in the Cookie Jar”

Tesla’s rolling-stop feature existed for months before it attracted much attention. Chris, who chronicles the good and the bad of Tesla’s latest features on YouTube under the name Dirty Tesla, said his Tesla performed automatic rolling stops for over a year before Tesla disabled the feature. He agreed to an interview on the condition that only his first name be used, for privacy reasons.

The scrutiny began this year. Regulators at the National Highway Traffic Safety Administration asked Tesla about the feature, and in January the automaker issued an “over-the-air” software update to disable it. NHTSA classified the software update as an official safety recall.

Critics were surprised not only by the decision to program the cars this way, but also by Tesla’s decision to test the features with customers rather than professional test drivers.

Safety advocates said they were not aware of any US jurisdiction where rolling stops were legal and could find no safety reasons for allowing them.

“They’re breaking the letter of the law in a very transparent way, and that completely erodes the trust they’re trying to garner from the public,” said William Widen, a University of Miami law professor who has written about autonomous vehicle regulation.

“I would be open about it,” Widen said, “as opposed to getting caught with your hand in the cookie jar.”

Safety advocates also questioned two entertainment features, unrelated to autonomous driving, that they say circumvent safety laws. One, called Passenger Play, allowed drivers to play video games while driving. Another, called Boombox, allowed drivers to blast music or other audio from their cars while driving, a possible danger for pedestrians, including blind people.

Tesla recently pushed software updates that restrict both of these features, and NHTSA opened an investigation into Passenger Play.

Tesla, the top-selling EV maker, hasn’t called the features mistakes or acknowledged that they might have created safety risks. Instead, Musk denied that rolling stops could be unsafe and called federal vehicle safety officials “the fun police” for objecting to Boombox.

Separately, NHTSA is investigating Tesla for possible safety defects in Autopilot, its standard driver assistance system, following a series of accidents in which Tesla vehicles with the system activated crashed into stationary first-responder vehicles. Tesla has faced lawsuits and accusations that Autopilot is unsafe because it cannot always recognize other vehicles or obstacles on the road. Tesla has generally denied the claims made in court cases, including in a Florida case where it argued in court documents that the driver was at fault in a pedestrian’s death.

NHTSA declined an interview request.

It’s not clear what state or local regulators might do to adjust to the reality Tesla is trying to create.

“All vehicles operated on California’s public highways are expected to comply with the California Vehicle Code and local traffic laws,” the California Department of Motor Vehicles said in a statement.

The agency added that automated vehicle technology should be deployed in a way that both “encourages innovation” and “addresses public safety” — two goals that can be at odds if innovation means intentionally breaking traffic rules. Officials there declined a request for an interview.

Musk, like most proponents of self-driving technology, has focused on the number of deaths caused by human-driven vehicles today. He has said his priority is to bring about a self-driving future as quickly as possible in a theoretical bid to reduce the 1.35 million annual traffic deaths worldwide. However, there is no way to measure how safe a truly self-driving vehicle would be, and even comparing Teslas to other vehicles is difficult because of factors such as differences in vehicle age.

Industry commitments

At least one other company has faced accusations of intentionally violating traffic rules, but with a different outcome than Tesla.

Last year, San Francisco city officials expressed concern that Cruise, which is majority-owned by General Motors, had programmed its vehicles to stop in travel lanes, in violation of the California vehicle code. Cruise’s driverless development vehicles will be used in a robo-taxi service that picks up and drops off passengers without a driver at the wheel.

Cruise responded with something Tesla hasn’t yet offered: a promise to obey the law.

“Our vehicles are programmed to comply with all traffic laws and regulations,” Cruise spokesman Aaron Mclear said in a statement.

Waymo, another company pursuing self-driving technology, has programmed its cars to only break traffic rules when they conflict with one another, such as crossing a double yellow line to give a cyclist more space, said Waymo spokeswoman Julianne McGoldrick.

“We prioritize safety and compliance with traffic rules over how familiar a behavior might be to other drivers. For example, we do not program the vehicle to exceed the speed limit just because that is familiar to other drivers,” she said in a statement.

A third company, Mercedes, said it was ready to be held liable for accidents that occur in situations where it had promised that its Drive Pilot driver assistance system would be safe and comply with traffic laws.

Mercedes did not respond to a request for information about its approach to automated vehicles and whether they should ever circumvent traffic laws.

Safety experts are unwilling to give Tesla or anyone else a pass to break the law.

“At a time when pedestrian fatalities are at a 40-year high, we shouldn’t be relaxing the rules,” said Leah Shahum, director of the Vision Zero Network, an organization working to eliminate traffic deaths in the United States.

“We have to aim for a higher goal than a system that is no worse than what we have today. It should be dramatically better,” Shahum said.
