Most automakers offer some form of advanced driver assistance, including automatic lane keeping on highways and radar cruise control. Tesla’s Autopilot, for all its faults, is one of the most advanced systems available. When enabled on highways, the car handles most of the driving, but the driver still needs to stay alert and ready to take over should an unexpected situation arise. For example, multiple people have died in separate incidents where Teslas slammed into 18-wheelers crossing highways, trucks that, for whatever reason, Autopilot did not properly recognize.
After an investigation into Autopilot following these incidents, Senator Ed Markey from Massachusetts praised Tesla for popularizing electric vehicles, but called Autopilot “a flawed system.” He recommended two fixes: rebrand and remarket Autopilot as something that sounds less, well, Autopilot-y, and implement “backup driver monitoring tools” to prevent misuse.
Technically, Tesla does have a driver monitoring system that is supposed to disengage Autopilot if a driver isn’t paying attention, but it is easily circumvented: rather than watching the driver, it simply checks for slight resistance on the steering wheel. This “misuse” can take the form of drivers lodging water bottles in, or hanging weights on, the steering wheel to fool the car into thinking the driver is holding it.
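To make the weakness concrete, here is a minimal, hypothetical sketch of a torque-based hands-on-wheel check, which is how Tesla’s detection has been widely reported to work. All names and thresholds are invented for illustration, not taken from Tesla’s actual code. The point is that a static resistance check cannot distinguish a gripping hand from a hanging weight, because both apply steady torque to the wheel.

```python
# Hypothetical, simplified torque-based "hands on wheel" check.
# The threshold and names are invented; this is not Tesla's code.

HANDS_ON_TORQUE_NM = 0.3  # assumed minimum steering resistance to count as "hands on"

def hands_detected(measured_torque_nm: float) -> bool:
    """Return True if the wheel feels enough resistance to pass the check."""
    return abs(measured_torque_nm) >= HANDS_ON_TORQUE_NM

# A gripping hand and a hanging weight both apply steady torque,
# so the check cannot tell them apart:
print(hands_detected(0.5))  # True  -- attentive driver holding the wheel
print(hands_detected(0.5))  # True  -- water bottle or weight hung on a spoke
print(hands_detected(0.0))  # False -- nothing on the wheel; Autopilot nags
```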
In its response to a series of questions from Markey, the company said, as it always has, that none of this is its fault or problem. “First, throughout the purchase, user and ownership experience, Tesla clearly explains that Autopilot is not an autonomous system and does not make our vehicles autonomous,” the company wrote. “When used properly, Autopilot can greatly enhance occupant safety, but, as an SAE L2 ADAS [Society of Automotive Engineers Level 2 Advanced Driver Assistance System], the driver is ultimately responsible for the safe operation of his vehicle.”
Tesla’s response continues the company’s tradition of responding to official government safety inquiries with language far clearer and more precise than its marketing copy. If you go to Tesla’s website, you will see “Full Self-Driving Capability” in giant header font. In the paragraph below, the company promises: “All new Tesla cars have the hardware needed in the future for full self-driving in almost all circumstances. The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat.”
When buying a car through the website’s order page, the section on Autopilot reads: “Autopilot enables your car to steer, accelerate and brake automatically within its lane. Full Self-Driving Capability introduces additional features and improves existing functionality to make your car more capable over time,” including automatic lane changes and navigation. The website does not clearly state that “Full Self-Driving Capability” is in fact not full self-driving capability.
In the most basic sense, selling a software package called “Full Self-Driving” undermines Tesla’s argument that it “clearly explains that Autopilot is not an autonomous system.” Moreover, calling it “Autopilot” at all is a marketing and sales strategy built on winks and nods, one that has been a contributing factor in multiple deaths.
It is also a stark contrast with the approach other auto companies have taken. The closest comparison to Autopilot is Cadillac’s Super Cruise, which can be enabled on most major U.S. highways. The company markets it as “hands free driving.” Super Cruise features a camera mounted on the steering column facing the driver. If the driver looks away from the road for more than a few seconds, the car starts beeping, the seat vibrates, and ultimately the car will flash its hazards and pull over if the driver doesn’t regain control. Tesla’s Autopilot does something similar if it doesn’t detect a hand (or weight, or orange, or whatever) on the wheel for a similar amount of time, but unlike Autopilot, Super Cruise has no proven, obvious way to be fooled.
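As a rough illustration of the difference, a camera-based monitor like the one described above amounts to a timer-driven escalation ladder keyed to where the driver is looking, not to what the steering wheel feels. The sketch below is a guess at the general shape of such a policy; the stage timings and action names are invented and are not GM’s actual implementation.

```python
# Hypothetical escalation ladder for a camera-based driver monitor,
# loosely modeled on the behavior described above. Timings are invented;
# this is not GM's Super Cruise implementation.
from dataclasses import dataclass

@dataclass
class Stage:
    after_s: float  # seconds of eyes-off-road before this stage triggers
    action: str

ESCALATION = [
    Stage(3.0, "beep"),                     # audible warning
    Stage(6.0, "vibrate_seat"),             # haptic warning
    Stage(10.0, "flash_hazards_and_stop"),  # car pulls itself over
]

def monitor_action(eyes_off_road_s: float) -> str:
    """Return the strongest response earned by this much inattention."""
    action = "none"
    for stage in ESCALATION:
        if eyes_off_road_s >= stage.after_s:
            action = stage.action
    return action

print(monitor_action(1.0))   # "none" -- a brief glance away is fine
print(monitor_action(4.0))   # "beep"
print(monitor_action(12.0))  # "flash_hazards_and_stop"
```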
Cadillac’s system stands in contrast to Tesla’s; Tesla has known for years that Autopilot can be fooled and hasn’t seemed to care enough to fix it. The company told Markey that it is aware of the online videos that show drivers sleeping at the wheel and performing other dangerous maneuvers that clearly violate its rules, but added, “We believe that many of these videos are fake and intended to capture media attention.”
To be fair to Tesla, it has some experience here. Back in 2018, CEO Elon Musk gave a demonstration of Autopilot on 60 Minutes and took his hands off the wheel for a prolonged period, one of the very behaviors Tesla’s own Autopilot instructions warn drivers against. Breaking rules to capture media attention is something Tesla knows plenty about.