Tesla’s claims about Autopilot and Full Self-Driving are under scrutiny as senators pressure FTC


Electrek.co 18 August, 2021 - 03:18pm

When is Tesla AI Day?

On July 29, CEO Elon Musk announced on Twitter that Tesla would host an A.I. Day on August 19: "Will go over progress with Tesla AI software & hardware, both training & inference. Purpose is recruiting." (Inverse: "Tesla A.I. Day: What to know and livestream info for Elon Musk's huge event")

Federal Investigation into Tesla Autopilot Defects Could Pull 765k Cars From U.S. Roads

Streetsblog New York 18 August, 2021 - 07:20pm

Following 11 incidents in the U.S. in which both Tesla’s advanced driver assistance systems and their human backup drivers failed to prevent crashes into emergency services vehicles, the National Highway Traffic Safety Administration’s Office of Defects Investigation announced on Monday that it will examine whether the company’s so-called “Autopilot” and “Traffic Aware Cruise Control” features are too faulty to be allowed on U.S. roads. In most of the crashes, the struck vehicles were parked and surrounded by “scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones” that should have made them easily visible to Tesla’s sensors, the agency noted.

If the investigation results in a recall, an estimated 765,000 cars could be pulled from U.S. roads — the majority of the vehicles ever produced by the company.

The news was celebrated by safe streets advocates, many of whom say the company knowingly overstates the reliability of its semi-autonomous driving technology. Despite the deceptive name, Teslas equipped with Autopilot are not, in fact, self-driving cars, but rather standard vehicles equipped with a range of driver assistance systems that are theoretically capable of automatic emergency braking, adaptive cruise control, and automatic steering on clearly marked highways, among other features. But in practice, these systems can fail while the cars’ owners are (sometimes literally) asleep at the wheel — and the cars’ built-in driver monitoring systems, which lack recommended safeguards like camera-based eye tracking, are easily tricked.

“[This news is] long overdue,” said David Zipper, a transportation researcher and visiting Fellow at Harvard Kennedy School’s Taubman Center for State and Local Government. “Basically, Tesla’s made a series of strategic decisions to cut corners on safety in order to create advantages for the company. There’s a lot of skepticism of autonomous driving in general, but this company is pretty exceptional in its recklessness. Nobody else is naming something like this ‘autopilot’ [while also remaining] so resistant to installing driver monitoring systems to make sure drivers use it responsibly.”

The collisions that sparked NHTSA’s investigation aren’t the only Tesla crashes that have caught the attention of safety hawks.

The company was hit with a lawsuit following a 2018 Tokyo crash that marked the first time a driver relying on Autopilot technology killed a pedestrian on a public road; a handful more have died in U.S. crashes since, subjecting the automaker to further legal action. Tesla dissolved its press relations department in 2020 and could not be reached for comment on this story, but the owner’s manual it issues to customers repeatedly insists that human drivers are ultimately responsible for complying with life-saving local traffic laws.

But critics say that a few written warnings to drivers in a more-than-200-page booklet aren’t nearly enough — especially because Tesla could automatically restrict the use of the error-prone tech to relatively predictable (and pedestrian-free) environments such as divided highways, as other automakers that sell cars equipped with advanced driver assistance already do.

Tesla has made some upgrades to its safety systems over the years, but some advocates think the company hasn’t done enough — and that the regulators that oversee semi-autonomous vehicle manufacturers in general haven’t, either.

Following a 2016 crash in Williston, Fla. involving a Tesla driver who was warned seven times to put his hands back on the wheel without Autopilot bringing his vehicle to a safe stop, the National Transportation Safety Board in 2017 encouraged NHTSA to “develop a method to verify that manufacturers [of such systems] incorporate safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.”

Despite a contentious history with the safety board — Musk reportedly hung up on then-NTSB chair Robert Sumwalt during a heated conversation about a 2018 probe into a crash involving one of his company’s cars — the automaker did add new features to curb hands-off driving. But NHTSA has yet to update federal policies to require all companies to take such safety measures, something the board hopes could finally happen now.

“Today’s action by NHTSA is a positive step forward for safety,” said NTSB Chairwoman Jennifer Homendy in a release. “As we navigate the emerging world of advanced driving assistance systems, it’s important that NHTSA has insight into what these vehicles can, and cannot, do.”

Clip from China of a $TSLA Model 3, evidently equipped with the latest and greatest safety technology. My guess is the 4D dojo supercomputer recognizes that the pedestrian dummy is not a human and therefore sees no reason to perform evasive maneuvers. pic.twitter.com/R1mQqbxIy9

— degen (@finance_degen) September 20, 2020

A Tesla recall could have an impact on the company’s progress towards more advanced autonomous driving tech, too.

In a decision that was widely decried by all but the most fervent AV supporters, the company recently announced its intentions to release the latest edition of its “Full Self Driving” technology to a select group of its drivers for beta testing on neighborhood roads throughout the U.S., defying industry norms like testing new tech on private tracks, or at least using trained drivers when cars must be tested in the public right of way. Much like vehicles running on Autopilot, Teslas equipped with “Full Self Driving” mode are not autonomous; they simply offer additional features like automatic parking, along with still-nascent technology that the company says will someday identify “stop signs and traffic lights and automatically slows your car to a stop on approach, with [drivers’] active supervision.” (Emphasis ours.)

Musk has cautioned his beta-testers to “please be paranoid” on the roads, but advocates say that polite request is far less than the participants in his experiment deserve — especially because virtually all of the pedestrians, cyclists, wheelchair users, and other drivers involved are completely unwitting.

“Tesla defenders tend to say, [programs like] ‘Autopilot may not be perfect, but drivers accept the risk as they drive,'” adds Zipper. “Well, that may be true, but the people outside the car didn’t.”

Running preproduction software is both work & fun. Beta list was in stasis, as we had many known issues to fix.

Beta 9 addresses most known issues, but there will be unknown issues, so please be paranoid.

Safety is always top priority at Tesla.

— Elon Musk (@elonmusk) July 9, 2021

If NHTSA’s investigation fails to rein in Musk’s recklessness, advocates worry it may be challenging to exert control over the rest of the fast-changing industry. Most proponents of autonomous vehicles not employed by Tesla say they’re happy to keep truly self-driving technology off neighborhood roads until it’s certified safe, but verbal commitments from the private sector don’t carry the force of a government mandate. The bipartisan infrastructure bill currently being debated in Congress largely ignores the autonomous vehicle revolution, aside from making the industry eligible for new research and development grants.

Advocates are hopeful that NHTSA’s new investigation is a sign that the agency is finally stepping up to the plate.

“I don’t know why we need Congress to get NHTSA to do its job,” added Zipper. “The NTSB has been basically jumping up and down screaming that we need to do something about this Tesla for years. Maybe now they will. … And maybe someday, they’ll expand their capacity to do more investigations like this.”

Filed Under: Autonomous cars, Elon Musk, National Transportation Safety Board, NHTSA, Self-driving cars, Tesla
