A Tesla involved in a fatal crash on a Southern California freeway last week was operating on Autopilot at the time, authorities said.
The National Highway Traffic Safety Administration (NHTSA) is investigating the May 5 crash in Fontana, 80 kilometres (50 miles) east of Los Angeles. It is the 29th case involving a Tesla that the agency has responded to.
A 35-year-old man was killed when his Tesla struck an overturned semi truck on the freeway at about 2:30am local time (09:30 GMT). The driver's name has not been made public. Another man was seriously injured when the electric car hit him as he was helping the semi's driver out of the wreck.
The California Highway Patrol (CHP) announced on Thursday that the car had been operating on Autopilot, Tesla's partially automated driving system, which has been involved in multiple crashes. The Fontana crash marks at least the fourth US death involving Autopilot.
"While the CHP does not normally comment on ongoing investigations, the Department recognises the high level of interest centred around crashes involving Tesla vehicles," the agency said in a statement. "We felt this information provides an opportunity to remind the public that driving is a complex task that requires a driver's full attention."
The federal safety investigation comes just after the CHP arrested another man who, authorities say, was riding in the back seat of a Tesla that was travelling on Interstate 80 near Oakland this week with no one behind the wheel.
The CHP has not said whether officials determined if the Tesla in the I-80 incident was operating on Autopilot, a feature that keeps the car centred in its lane and a safe distance behind the vehicle in front of it.
But it is likely that the driver was operating either Autopilot or "Full Self-Driving" while riding in the back seat. Tesla allows a limited number of owners to test its self-driving software.
Tesla, which has disbanded its public relations department, did not respond to an email seeking comment on Friday. The company says in owner's manuals and on its website that neither Autopilot nor "Full Self-Driving" is fully autonomous, and that drivers must pay attention and be ready to intervene at any time.
Autopilot has at times had trouble dealing with stationary objects and cross traffic in front of Teslas.
In two Florida crashes, in 2016 and 2019, cars operating on Autopilot drove beneath crossing tractor-trailers, killing the men driving the Teslas. In a 2018 crash in Mountain View, California, an Apple engineer driving on Autopilot was killed when his Tesla struck a highway barrier.
Tesla's system, which uses cameras, radar and short-range sonar, has also had trouble handling stopped emergency vehicles. Teslas have struck several firetrucks and police vehicles that were parked on freeways with their emergency lights flashing.
In one example, NHTSA sent a team to investigate after a Tesla on Autopilot crashed into a Michigan State Police car on Interstate 96 near Lansing. Police said neither the trooper nor the 22-year-old Tesla driver was injured.
After the fatal crashes in Florida and California, the National Transportation Safety Board (NTSB) recommended that Tesla develop a stronger system to ensure drivers are paying attention, and that it limit where Autopilot can be used to highways where it can operate effectively. Neither Tesla nor the safety agency acted on the recommendations.
In a February 1 letter to the US Department of Transportation, NTSB Chairman Robert Sumwalt urged the department to enact regulations governing driver-assist systems such as Autopilot, as well as the testing of autonomous vehicles. NHTSA has relied mainly on voluntary vehicle guidelines, taking a hands-off approach so as not to hinder the development of new safety technology.
Sumwalt said Tesla is using people who bought its cars to test "Full Self-Driving" software on public roads with limited oversight or reporting requirements.
"Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV [autonomous vehicle] control system's limitations," Sumwalt wrote.
He added: "Although Tesla includes a disclaimer that 'currently enabled features require active driver supervision and do not make the vehicle autonomous', NHTSA's hands-off approach to oversight of AV testing poses a potential risk to motorists and other road users."
NHTSA has the authority to regulate automated driving systems and to seek recalls if necessary. It appears to have developed a renewed interest in the systems since US President Joe Biden took office.