How Will Tesla's Auto Pilot Safety Probe Affect The EV Maker And The Industry? - Benzinga

Benzinga 19 August, 2021 - 09:30am

What time is the Tesla AI event?

What is Tesla AI Day? Mr. Musk has billed it as a recruiting event to spotlight the work the company is doing on artificial intelligence. It is scheduled to take place on Thursday in Palo Alto, Calif., at 5 p.m. local time. Tesla will live stream the gathering, Mr. Musk said Tuesday.

The Wall Street Journal: Tesla to Hold AI Day at a Time of New Scrutiny: What to Expect

Why drivers are zoning out behind the wheel

Axios 19 August, 2021 - 07:00pm

If you haven't bought a new car in a few years, you might be surprised at how many driving tasks are now automated — speed control, braking, lane-keeping and even changing lanes.

Why it matters: Carmakers keep adding more automated features in the name of safety. But now authorities want to find out whether assisted-driving technology is itself dangerous because it makes it too easy for drivers to misuse.

Context: Federal regulators have taken a mostly hands-off approach to automated vehicle technologies, offering only guidelines for fully driverless cars like robotaxis, which are under development and evolving.

What's happening: The National Highway Traffic Safety Administration said recently that companies must report serious crashes involving driver-assistance and automated-driving systems to authorities within a day of learning about them.

Between the lines: While the focus on crashes with emergency vehicles is fairly narrow, NHTSA will be looking carefully at where and how Autopilot functions, including how it identifies and reacts to obstacles in the road.

Be smart: Tesla Autopilot is not an autonomous driving system. It is an advanced driver assistance system (ADAS) that allows the car to maintain its speed and stay in its lane.

What to watch: NHTSA will consider whether there is a defect in Tesla's Autopilot system due to a "foreseeable misuse" of the technology and whether all of its 765,000 affected cars should be recalled.

The bottom line: Authorities are reviewing not just whether assisted-driving technology works, but also its effects on human behavior.

The National Highway Traffic Safety Administration has opened a formal investigation into Tesla's Autopilot function after a series of crashes involving emergency vehicles.

The big picture: The probe will cover all of Tesla's current models, an estimated 765,000 vehicles. The agency has identified 11 crashes since 2018 in which Tesla vehicles on Autopilot struck vehicles at scenes where first responders had used flashing lights, flares or road cones. At least 17 people were injured and one person died in the crashes, according to NHTSA.

US Investigates Crashes Involving Tesla Cars Using ‘Autopilot’

VOA Learning English 19 August, 2021 - 07:00pm

The National Highway Traffic Safety Administration (NHTSA) said this week it had identified 11 crashes involving Tesla models that struck emergency vehicles that were parked on the road. Those accidents resulted in one death and 17 injuries.

Three of the 11 crashes took place in California, while others were in Florida, Texas, Massachusetts and other states. The four latest crashes happened in 2021, including one in July.

The NHTSA said it had confirmed that the Tesla cars involved were using either the company’s Autopilot or Traffic-Aware Cruise Control systems just before the crashes. Both systems use a series of cameras and sensors to assist drivers.

Tesla says its Autopilot system is designed to permit full self-driving capabilities. But the company says that when the system is in use, drivers should still keep their hands on the steering wheel and be ready to take control if it fails to work.

When turned on, Autopilot requires drivers to agree to keep their hands on the steering wheel at all times. The system also includes sensors to identify pressure on the steering wheel and issues warnings if it thinks the driver is not holding the wheel.

The company’s Traffic-Aware Cruise Control is designed to keep the vehicle at similar speeds as surrounding traffic.

The NHTSA said most of the accidents took place after dark. But it noted that the parked emergency vehicles used safety measures, such as flashing lights, flares or road markers.

The agency said the investigation will cover 765,000 Tesla vehicles. That number includes nearly every vehicle the company has sold in the U.S. since the start of the 2014 model year.

The NHTSA announced it had begun its investigation in documents published online.

Tesla has not officially commented. Tesla chief Elon Musk has repeatedly defended the Autopilot system. In April, he tweeted that Tesla vehicles using Autopilot were “now approaching 10 times lower chance of accident than (an) average vehicle."

Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle https://t.co/6lGy52wVhC

The agency closed an earlier investigation into Autopilot in 2017 without taking any action. After its new investigation, the NHTSA could decide again to take no action. Or it could demand a vehicle recall, which could place limits on how, when and where Autopilot operates.

In a statement to the French press agency AFP, an NHTSA spokesperson said the agency wants the public to know that "no commercially available motor vehicles today are capable of driving themselves."

The spokesperson noted that driver assistance programs can improve safety, help drivers avoid accidents and reduce the severity of crashes. But the spokesperson added that, “as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly."

The National Transportation Safety Board (NTSB) has said Autopilot was operating in at least three deadly Tesla crashes in the U.S. since 2016. It has criticized Tesla's lack of system safeguards for Autopilot and NHTSA's failure to take steps to ensure the safety of the system.

The chair of the NTSB, Jennifer Homendy, praised the NHTSA’s new investigation. She said the board has urged the agency to develop standards for driver assistance systems. It has also called for requirements for automakers to use “system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed."

Jason Levine, head of the nonprofit Center for Auto Safety, also welcomed the new investigation. He said the NHTSA had finally answered longstanding calls “to investigate Tesla for putting technology on the road that will be foreseeably misused in a way that is leading to crashes, injuries, and deaths.” Levine added that he thinks the investigation should be expanded to include crashes beyond those just involving emergency vehicles.

park – v. to leave a vehicle in a particular place for a period of time

steering – n. the controlling of a vehicle in one direction or another

flash – v. to shine brightly and suddenly

flare – n. a piece of safety equipment that produces a bright signal

approach – v. to come close in distance or time

recall – v. to order the return of something

commercial – adj. relating to buying and selling things

automate – v. to use machines and computers instead of people to do something

foreseeable – adj. able to be known about or expected before it happens
