DETROIT — A tweet from Elon Musk indicating that Tesla could allow some owners who are testing a “Full Self-Driving” system to disable an alert that reminds them to keep their hands on the steering wheel has drawn attention from U.S. safety regulators.
The National Highway Traffic Safety Administration says it asked Tesla for more information about the tweet. Last week, the agency said the matter is now part of a broader investigation into at least 14 Teslas that have crashed into emergency vehicles while using the Autopilot driver assist system.
Since 2021, Tesla has been beta-testing “Full Self-Driving” using owners who haven’t been trained on the system but are actively monitored by the company. Earlier this year, Tesla said 160,000 owners, roughly 15% of Teslas now on U.S. roads, were participating. A wider distribution of the software was to be rolled out late in 2022.
Despite the name, Tesla still says on its website that the cars cannot drive themselves. Teslas using “Full Self-Driving” can navigate roads themselves in many cases, but experts say the system can make mistakes. “We’re not saying it’s quite ready to have no one behind the wheel,” CEO Musk said in October.
On New Year’s Eve, one of Musk’s most ardent fans posted on Twitter that drivers with more than 10,000 miles of “Full Self-Driving” testing should have the option to turn off the “steering wheel nag,” an alert that tells drivers to keep their hands on the wheel.
Musk replied: “Agreed, update coming in Jan.”
It’s not clear from the tweets exactly what Tesla will do. But disabling a driver monitoring system on any vehicle that automates speed and steering would pose a danger to other drivers on the road, said Jake Fisher, senior director of auto testing for Consumer Reports.
“Using FSD beta, you’re kind of part of an experiment,” Fisher stated. “The problem is the other road users adjacent to you haven’t signed up to be part of that experiment.”
Tesla didn’t respond to a message seeking comment about the tweet or its driver monitoring.
Auto safety advocates and government investigators have long criticized Tesla’s monitoring system as inadequate. Three years ago the National Transportation Safety Board listed poor monitoring as a contributing factor in a fatal 2018 Tesla crash in California. The board recommended a better system, but said Tesla has not responded.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, argued that Tesla is contradicting itself in a way that could confuse drivers. “They’re trying to make customers happy by taking their hands off the wheel, even while the (owners) manual says ‘don’t do that.’ ”
Indeed, Tesla’s website says Autopilot and the more sophisticated “Full Self-Driving” system are intended for use by a “fully attentive driver who has their hands on the wheel and is prepared to take over at any moment.” It says the systems are not fully autonomous.
Source: www.bostonherald.com